Open In Colab

LICENSING NOTICE¶

All users of VitalDB, an open biosignal dataset, must agree to the Data Use Agreement below. If you do not agree, please close this window. The Data Use Agreement is available here: https://vitaldb.net/dataset/#h.vcpgs1yemdb5

This is the development version of the project code¶

For the Project Draft submission see the DL4H_Team_24_Project_Draft.ipynb notebook in the project repository.

Project repository¶

The project repository can be found at: https://github.com/abarrie2/cs598-dlh-project

Introduction¶

This project aims to reproduce findings from the paper titled "Predicting intraoperative hypotension using deep learning with waveforms of arterial blood pressure, electroencephalogram, and electrocardiogram: Retrospective study" by Jo Y-Y et al. (2022) [1]. This study introduces a deep learning model that predicts intraoperative hypotension (IOH) events before they occur, utilizing a combination of arterial blood pressure (ABP), electroencephalogram (EEG), and electrocardiogram (ECG) signals.

Background of the Problem¶

Intraoperative hypotension (IOH) is a common and significant surgical complication defined by a mean arterial pressure drop below 65 mmHg. It is associated with increased risks of myocardial infarction, acute kidney injury, and heightened postoperative mortality. Effective prediction and timely intervention can substantially enhance patient outcomes.
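To make the 65 mmHg threshold concrete, the sketch below flags samples where a mean arterial pressure (MAP) series stays below the threshold for a sustained window. It is illustrative only; the event definition used in the actual preprocessing follows the original paper.

```python
import numpy as np

def flag_ioh(map_series, threshold=65.0, min_duration_s=60, fs=1):
    """Flag samples where MAP stays below `threshold` for at least
    `min_duration_s` seconds (map_series sampled at `fs` Hz).
    Illustrative helper, not the notebook's actual event labeller."""
    below = np.asarray(map_series) < threshold
    window = int(min_duration_s * fs)
    flags = np.zeros_like(below)
    run = 0
    for i, b in enumerate(below):
        run = run + 1 if b else 0
        if run >= window:
            flags[i - window + 1 : i + 1] = True
    return flags

# Synthetic 1 Hz MAP trace: 30 s at 70 mmHg, 90 s at 60 mmHg, 30 s at 70 mmHg.
map_trace = np.concatenate([np.full(30, 70.0), np.full(90, 60.0), np.full(30, 70.0)])
flags = flag_ioh(map_trace)
print(int(flags.sum()))  # 90 samples flagged: the sustained 90 s dip below 65 mmHg
```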

Evolution of IOH Prediction¶

Initial attempts to predict IOH primarily used arterial blood pressure (ABP) waveforms. A foundational study by Hatib F et al. (2018) titled "Machine-learning Algorithm to Predict Hypotension Based on High-fidelity Arterial Pressure Waveform Analysis" [2] showed that machine learning could forecast IOH events using ABP with reasonable accuracy. This finding spurred further research into utilizing various physiological signals for IOH prediction.

Subsequent advancements included the development of the Acumen™ hypotension prediction index, which was studied in "Acumen™ hypotension prediction index guidance for prevention and treatment of hypotension in noncardiac surgery: a prospective, single-arm, multicenter trial" by Bao X et al. (2024) [3]. This trial integrated a hypotension prediction index into blood pressure monitoring equipment, demonstrating its effectiveness in reducing the number and duration of IOH events during surgeries. Further study is needed to determine whether this reduction in IOH events translates into improved postoperative patient outcomes.

Current Study¶

Building on these advancements, the paper by Jo Y-Y et al. (2022) proposes a deep learning approach that enhances prediction accuracy by incorporating EEG and ECG signals along with ABP. This multi-modal method, evaluated over prediction windows of 3, 5, 10, and 15 minutes, aims to provide a comprehensive physiological profile that could predict IOH more accurately and earlier. Their results indicate that the combination of ABP and EEG significantly improves performance metrics such as AUROC and AUPRC, outperforming models that use fewer signals or different combinations.

Our project seeks to reproduce and verify Jo Y-Y et al.'s results to assess whether this integrated approach can indeed improve IOH prediction accuracy, thereby potentially enhancing surgical safety and patient outcomes.

Scope of Reproducibility:¶

The original paper investigated the following hypotheses:

  1. Hypothesis 1: A model using ABP and ECG will outperform a model using ABP alone in predicting IOH.
  2. Hypothesis 2: A model using ABP and EEG will outperform a model using ABP alone in predicting IOH.
  3. Hypothesis 3: A model using ABP, EEG, and ECG will outperform a model using ABP alone in predicting IOH.

Results were compared using AUROC and AUPRC scores. Based on the results described in the original paper, we expect that Hypothesis 2 will be confirmed, and that Hypotheses 1 and 3 will not be confirmed.
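As a reference for how these comparisons are computed, the sketch below evaluates AUROC and AUPRC with scikit-learn on synthetic labels and scores (stand-ins, not model output):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, precision_recall_curve, auc

# Synthetic stand-ins for one model variation's test-set output:
# 100 non-events (label 0) and 100 IOH events (label 1), with scores
# shifted upward for the positive class plus Gaussian noise.
y_true = np.array([0] * 100 + [1] * 100)
rng = np.random.default_rng(0)
y_score = np.clip(y_true * 0.4 + rng.normal(0.3, 0.25, size=200), 0, 1)

auroc = roc_auc_score(y_true, y_score)
precision, recall, _ = precision_recall_curve(y_true, y_score)
auprc = auc(recall, precision)
print(f"AUROC={auroc:.3f}  AUPRC={auprc:.3f}")
```

Comparing two model variations then reduces to computing these two numbers for each variation on the same held-out test set.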

In order to perform the corresponding experiments, we will implement a CNN-based model that can be configured to train and infer using the following four model variations:

  1. ABP data alone
  2. ABP and ECG data
  3. ABP and EEG data
  4. ABP, ECG, and EEG data

We will measure the performance of these configurations using the same AUROC and AUPRC metrics as the original paper. To test Hypotheses 1, 2, and 3, we will compare the AUROC and AUPRC of model variation 1 against model variations 2, 3, and 4, respectively. For each of these comparisons, we will run separate experiments in which the time-to-IOH event prediction uses the following prediction windows:

  1. 3 minutes before event
  2. 5 minutes before event
  3. 10 minutes before event
  4. 15 minutes before event

In the event that we are compute-bound, we will prioritize the 3-minute prediction window experiments as they are the most relevant to the original paper's findings.

Predictive power of ABP, ECG and ABP + ECG models at 3-, 5-, 10- and 15-minute prediction windows

Modifications made for demo mode¶

In order to demonstrate the functioning of the code within a short runtime (i.e., under the 8-minute limit), the following options and modifications were used:

  1. MAX_CASES was set to 20. The full training set uses 3296 cases, but the smaller number allows each section of the pipeline to be demonstrated quickly.
  2. vitaldb_cache is prepopulated in Google Colab. The cache file is approx. 800MB, contains the raw and minified copies of the source dataset, and is downloaded from Google Drive. This is much faster than using the vitaldb API, but again covers only a fraction of the data. The full dataset can be downloaded with the API or prepopulated by following the instructions in the "Bulk Data Download" section below.
  3. max_epochs is set to 6. With the small dataset, training is fast and shows the decreasing training and validation losses. In the full model run, max_epochs will be set to 100. In both cases early stopping is enabled and will stop training if the validation losses stop decreasing for five consecutive epochs.
  4. Only the "ABP + EEG" combination will be run. In the final report, additional combinations will be run, as discussed later.
  5. Only the 3-minute prediction window will be run. In the final report, additional prediction windows (5, 10 and 15 minutes) will be run, as discussed later.
  6. No ablations are run in the demo. These will be completed for the final report.
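The early-stopping behaviour described in item 3 can be sketched as a simple patience counter. The class and variable names below are illustrative, not the notebook's actual training code:

```python
class EarlyStopper:
    """Stop training when validation loss has not improved for
    `patience` consecutive epochs (illustrative sketch)."""
    def __init__(self, patience=5):
        self.patience = patience
        self.best = float("inf")
        self.count = 0

    def step(self, val_loss):
        """Return True when training should stop."""
        if val_loss < self.best:
            self.best = val_loss
            self.count = 0
        else:
            self.count += 1
        return self.count >= self.patience

stopper = EarlyStopper(patience=5)
losses = [0.9, 0.8, 0.75, 0.76, 0.77, 0.78, 0.79, 0.80]
stops = [stopper.step(l) for l in losses]
print(stops)  # only the final entry is True: the 5th epoch without improvement
```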

Methodology¶

Methodology from Final Rubric¶

  • Environment
    • Python version
    • Dependencies/packages needed
  • Data
    • Data download instruction
    • Data descriptions with helpful charts and visualizations
    • Preprocessing code + command
  • Model
    • Citation to the original paper
    • Link to the original paper’s repo (if applicable)
    • Model descriptions
    • Implementation code
    • Pretrained model (if applicable)
  • Training
    • Hyperparams
      • Report at least 3 types of hyperparameters such as learning rate, batch size, hidden size, dropout
    • Computational requirements
      • Report at least 3 types of requirements such as type of hardware, average runtime for each epoch, total number of trials, GPU hrs used, # training epochs
    • Training code
  • Evaluation
    • Metrics descriptions
    • Evaluation code

The methodology section is composed of the following subsections: Environment, Data and Model.

  • Environment: This section describes the setup of the environment, including the installation of necessary libraries and the configuration of the runtime environment.
  • Data: This section describes the dataset used in the study, including its collection and preprocessing.
    • Data Collection: This section describes the process of downloading the dataset from VitalDB and populating the local data cache.
    • Data Preprocessing: This section describes the preprocessing steps applied to the dataset, including data selection, data cleaning, and feature extraction.
  • Model: This section describes the deep learning model used in the study, including its implementation, training, and evaluation.
    • Model Implementation: This section describes the implementation of the deep learning model, including the architecture, loss function, and optimization algorithm.
    • Model Training: This section describes the training process, including the training loop, hyperparameters, and training strategy.
    • Model Evaluation: This section describes the evaluation process, including the metrics used, the evaluation strategy, and the results obtained.

Environment¶

Create environment¶

The environment setup differs based on whether you are running the code on a local machine or on Google Colab. The following sections provide instructions for setting up the environment in each case.

Local machine¶

Create conda environment for the project using the environment.yml file:

conda env create --prefix .envs/dlh-team24 -f environment.yml

Activate the environment with:

conda activate .envs/dlh-team24

Google Colab¶

The following code snippet installs the required packages and downloads the necessary files in a Google Colab environment:

In [1]:
# Google Colab environments have a `/content` directory. Use this as a proxy for running Colab-only code
COLAB_ENV = "google.colab" in str(get_ipython())
if COLAB_ENV:
    #install vitaldb
    %pip install vitaldb

    # Executing in Colab therefore download cached preprocessed data.
    # TODO: Integrate this with the setup local cache data section below.
    # Check for file existence before overwriting.
    import gdown
    gdown.download(id="15b5Nfhgj3McSO2GmkVUKkhSSxQXX14hJ", output="vitaldb_cache.tgz")
    !tar -zxf vitaldb_cache.tgz

    # Download sqi_filter.csv from github repo
    !wget https://raw.githubusercontent.com/abarrie2/cs598-dlh-project/main/sqi_filter.csv

All other required packages are already installed in the Google Colab environment.

Load environment¶

In [2]:
# Import packages
import os
import random
import sys
import uuid
import copy
from collections import defaultdict

from timeit import default_timer as timer

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report, roc_auc_score, precision_recall_curve, auc, confusion_matrix
from sklearn.metrics import RocCurveDisplay, PrecisionRecallDisplay, average_precision_score
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
import torch
from torch.utils.data import Dataset
import vitaldb
import h5py

import torch.nn as nn
import torch.nn.functional as F
from tqdm import tqdm
from datetime import datetime
In [3]:
global_time_start = timer()

Set random seeds to generate consistent results:

In [4]:
RANDOM_SEED = 42

def reset_random_state():
    random.seed(RANDOM_SEED)
    np.random.seed(RANDOM_SEED)
    torch.manual_seed(RANDOM_SEED)
    if torch.cuda.is_available():
        torch.cuda.manual_seed(RANDOM_SEED)
        torch.cuda.manual_seed_all(RANDOM_SEED)
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False
    os.environ["PYTHONHASHSEED"] = str(RANDOM_SEED)
    
reset_random_state()

Set device to GPU or MPS if available

In [5]:
device = torch.device("cuda" if torch.cuda.is_available() else "mps" if (torch.backends.mps.is_available() and torch.backends.mps.is_built()) else "cpu")
print(f"Using device: {device}")
Using device: mps

Define class to print to console and simultaneously save to file:

In [6]:
class ForkedStdout:
    def __init__(self, file_path):
        self.file = open(file_path, 'w')
        self.stdout = sys.stdout

    def write(self, message):
        self.stdout.write(message)
        self.file.write(message)

    def flush(self):
        self.stdout.flush()
        self.file.flush()

    def __enter__(self):
        sys.stdout = self

    def __exit__(self, exc_type, exc_val, exc_tb):
        sys.stdout = self.stdout
        self.file.close()

Data¶

Data Description¶

Source¶

Data for this project is sourced from the open biosignal VitalDB dataset as described in "VitalDB, a high-fidelity multi-parameter vital signs database in surgical patients" by Lee H-C et al. (2022) [4], which contains perioperative vital signs and numerical data from 6,388 cases of non-cardiac (general, thoracic, urological, and gynecological) surgery patients who underwent routine or emergency surgery at Seoul National University Hospital between 2016 and 2017. The dataset includes ABP, ECG, and EEG signals, as well as other physiological data. The dataset is available through an API and Python library, and at PhysioNet: https://physionet.org/content/vitaldb/1.0.0/

Statistics¶

Characteristics of the dataset:

| Characteristic        | Value         | Details              |
|-----------------------|---------------|----------------------|
| Total number of cases | 6,388         |                      |
| Sex (male)            | 3,243 (50.8%) |                      |
| Age (years)           | 59            | Range: 48-68         |
| Height (cm)           | 162           | Range: 156-169       |
| Weight (kg)           | 61            | Range: 53-69         |
| Tram-Rac 4A tracks    | 6,355 (99.5%) | Sampling rate: 500Hz |
| BIS Vista tracks      | 5,566 (87.1%) | Sampling rate: 128Hz |
| Case duration (min)   | 189           | Range: 27-1041       |

Labels are only known after processing the data. In the original paper, there were an average of 1.6 IOH events and 5.7 non-events per case, so across 6,388 cases we expect approximately 10,221 IOH events and 36,412 non-events in the dataset.

Data Processing¶

Data will be processed as follows:

  1. Load the dataset from VitalDB, or from a local cache if previously downloaded.
  2. Apply the inclusion and exclusion selection criteria to filter the dataset according to surgery metadata.
  3. Generate a minified dataset by discarding all tracks except ABP, ECG, and EEG.
  4. Preprocess the data by applying band-pass and z-score normalization to the ECG and EEG signals, and filtering out ABP signals below a Signal Quality Index (SQI) threshold.
  5. Generate event and non-event samples by extracting 60-second segments around IOH events and non-events.
  6. Split the dataset into training, validation, and test sets with a 6:1:3 ratio, ensuring that samples from a single case are not split across different sets to avoid data leakage.
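The case-grouped split in step 6 can be sketched with scikit-learn's GroupShuffleSplit, applied twice to approximate the 6:1:3 ratio. The data below are synthetic and the variable names illustrative:

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

# Synthetic segments: 100 cases, 7 segments each; every segment carries
# the caseid it came from. Grouping by caseid keeps all segments from one
# surgery in a single partition, preventing leakage across sets.
rng = np.random.default_rng(42)
case_ids = np.repeat(np.arange(100), 7)
X = rng.normal(size=(case_ids.size, 4))

# First carve off the 30% test partition by case...
gss = GroupShuffleSplit(n_splits=1, test_size=0.3, random_state=42)
trainval_idx, test_idx = next(gss.split(X, groups=case_ids))
# ...then split the remaining 70% into train (6/7) and validation (1/7).
gss2 = GroupShuffleSplit(n_splits=1, test_size=1/7, random_state=42)
train_rel, val_rel = next(gss2.split(X[trainval_idx], groups=case_ids[trainval_idx]))
train_idx, val_idx = trainval_idx[train_rel], trainval_idx[val_rel]

# No case appears in more than one partition.
assert not (set(case_ids[train_idx]) & set(case_ids[test_idx]))
assert not (set(case_ids[train_idx]) & set(case_ids[val_idx]))
assert not (set(case_ids[val_idx]) & set(case_ids[test_idx]))
print(len(set(case_ids[train_idx])), len(set(case_ids[val_idx])),
      len(set(case_ids[test_idx])))  # roughly 60 / 10 / 30 cases
```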

Set Up Local Data Caches¶

VitalDB data is static, so local copies can be stored and reused to avoid expensive downloads and to speed up data processing.

The default directory defined below is in the project .gitignore file. If this is modified, the new directory should also be added to the project .gitignore.

In [7]:
VITALDB_CACHE = './vitaldb_cache'
VITAL_ALL = f"{VITALDB_CACHE}/vital_all"
VITAL_MINI = f"{VITALDB_CACHE}/vital_mini"
VITAL_METADATA = f"{VITALDB_CACHE}/metadata"
VITAL_MODELS = f"{VITALDB_CACHE}/models"
VITAL_RUNS = f"{VITALDB_CACHE}/runs"
VITAL_PREPROCESS_SCRATCH = f"{VITALDB_CACHE}/data_scratch"
VITAL_EXTRACTED_SEGMENTS = f"{VITALDB_CACHE}/segments"
In [8]:
TRACK_CACHE = None
SEGMENT_CACHE = None

# when USE_MEMORY_CACHING is enabled, track data will be persisted in an in-memory cache. Not useful once we have already pre-extracted all event segments
# DON'T USE: Stores items in memory that are later not used. Causes OOM on segment extraction.
USE_MEMORY_CACHING = False

# When RESET_CACHE is set to True, it will ensure the TRACK_CACHE is disposed and recreated when we do dataset initialization.
# Use as a shortcut to wiping cache rather than restarting kernel
RESET_CACHE = False

PREDICTION_WINDOW = 3
#PREDICTION_WINDOW = 'ALL'

ALL_PREDICTION_WINDOWS = [3, 5, 10, 15]

# Maximum number of cases of interest for which to download data.
# Set to a small value (ex: 20) for demo purposes, else set to None to disable and download and process all.
MAX_CASES = None
#MAX_CASES = 100

# Preloading Cases: when true, all matched cases will have the _mini tracks extracted and put into in-mem dict
PRELOADING_CASES = False
PRELOADING_SEGMENTS = True
# Perform Data Preprocessing: do we want to take the raw vital file and extract segments of interest for training?
PERFORM_DATA_PREPROCESSING = False
In [9]:
cache_dirs = [VITALDB_CACHE, VITAL_ALL, VITAL_MINI, VITAL_METADATA, VITAL_MODELS,
              VITAL_RUNS, VITAL_PREPROCESS_SCRATCH, VITAL_EXTRACTED_SEGMENTS]
# Create any cache directories that do not already exist.
for cache_dir in cache_dirs:
    os.makedirs(cache_dir, exist_ok=True)

print(os.listdir(VITALDB_CACHE))
['segments_filter_neg', 'segments_bak', 'runs_old', 'runs_03_15_parameter_tuning', 'segments_bak_0505', '.DS_Store', 'segments_filter_neg_pos', 'vital_mini_bak_0501', 'vital_all', 'segments_sizes_sp.txt', 'models_all_cases_baseline', 'segments_golden', 'models', 'docs', 'vital_mini.tar', 'data_scratch', 'segments_md5_sp.txt', 'vital_file_md5_mw.txt', 'segments_bak_0501', 'osfs', 'runs_03_15', 'vital_mini', 'segments_filter_none', 'vital_file_mini_md5_sp.txt', 'vital_file_mini_file_sizes_sp.txt', 'runs', 'metadata', 'segments', 'models_old', 'vital_file_md5_sp.txt']

Bulk Data Download¶

This step is not required, but will significantly speed up downstream processing and avoid a high volume of API requests to the VitalDB web site.

The cache population code checks whether the .vital files are available locally; the cache can be populated by calling the vitaldb API or by manually prepopulating it (recommended):

  • Manually download the dataset from the following site: https://physionet.org/content/vitaldb/1.0.0/
    • Download the zip file in a browser, or
    • Use wget -r -N -c -np https://physionet.org/files/vitaldb/1.0.0/ to download the files in a terminal
  • Move the contents of vital_files into the ${VITAL_ALL} directory.
In [10]:
# Returns the Pandas DataFrame for the specified dataset.
#   One of 'cases', 'labs', or 'trks'
# If the file exists locally, create and return the DataFrame.
# Else, download and cache the csv first, then return the DataFrame.
def vitaldb_dataframe_loader(dataset_name):
    if dataset_name not in ['cases', 'labs', 'trks']:
        raise ValueError(f'Invalid dataset name: {dataset_name}')
    file_path = f'{VITAL_METADATA}/{dataset_name}.csv'
    if os.path.isfile(file_path):
        print(f'{dataset_name}.csv exists locally.')
        df = pd.read_csv(file_path)
        return df
    else:
        print(f'downloading {dataset_name} and storing in the local cache for future reuse.')
        df = pd.read_csv(f'https://api.vitaldb.net/{dataset_name}')
        df.to_csv(file_path, index=False)
        return df

Exploratory Data Analysis¶

Cases¶

In [11]:
cases = vitaldb_dataframe_loader('cases')
cases = cases.set_index('caseid')
cases.shape
cases.csv exists locally.
Out[11]:
(6388, 73)
In [12]:
cases.index.nunique()
Out[12]:
6388
In [13]:
cases.head()
Out[13]:
subjectid casestart caseend anestart aneend opstart opend adm dis icu_days ... intraop_colloid intraop_ppf intraop_mdz intraop_ftn intraop_rocu intraop_vecu intraop_eph intraop_phe intraop_epi intraop_ca
caseid
1 5955 0 11542 -552 10848.0 1668 10368 -236220 627780 0 ... 0 120 0.0 100 70 0 10 0 0 0
2 2487 0 15741 -1039 14921.0 1721 14621 -221160 1506840 0 ... 0 150 0.0 0 100 0 20 0 0 0
3 2861 0 4394 -590 4210.0 1090 3010 -218640 40560 0 ... 0 0 0.0 0 50 0 0 0 0 0
4 1903 0 20990 -778 20222.0 2522 17822 -201120 576480 1 ... 0 80 0.0 100 100 0 50 0 0 0
5 4416 0 21531 -1009 22391.0 2591 20291 -67560 3734040 13 ... 0 0 0.0 0 160 0 10 900 0 2100

5 rows × 73 columns

In [14]:
cases['sex'].value_counts()
Out[14]:
sex
M    3243
F    3145
Name: count, dtype: int64

Tracks¶

In [15]:
trks = vitaldb_dataframe_loader('trks')
trks = trks.set_index('caseid')
trks.shape
trks.csv exists locally.
Out[15]:
(486449, 2)
In [16]:
trks.index.nunique()
Out[16]:
6388
In [17]:
trks.groupby('caseid')[['tid']].count().plot();
In [18]:
trks.groupby('caseid')[['tid']].count().hist();
In [19]:
trks.groupby('tname').count().sort_values(by='tid', ascending=False)
Out[19]:
tid
tname
Solar8000/HR 6387
Solar8000/PLETH_SPO2 6386
Solar8000/PLETH_HR 6386
Primus/CO2 6362
Primus/PAMB_MBAR 6361
... ...
Orchestra/AMD_VOL 1
Solar8000/ST_V5 1
Orchestra/NPS_VOL 1
Orchestra/AMD_RATE 1
Orchestra/VEC_VOL 1

196 rows × 1 columns

Parameters of Interest¶

Hemodynamic Parameters Reference¶

https://vitaldb.net/dataset/?query=overview#h.f7d712ycdpk2

SNUADC/ART

arterial blood pressure waveform

Parameter, Description, Type/Hz, Unit

SNUADC/ART, Arterial pressure wave, W/500, mmHg

In [20]:
trks[trks['tname'].str.contains('SNUADC/ART')].shape
Out[20]:
(3645, 2)

SNUADC/ECG_II

electrocardiogram waveform

Parameter, Description, Type/Hz, Unit

SNUADC/ECG_II, ECG lead II wave, W/500, mV

In [21]:
trks[trks['tname'].str.contains('SNUADC/ECG_II')].shape
Out[21]:
(6355, 2)

BIS/EEG1_WAV

electroencephalogram waveform

Parameter, Description, Type/Hz, Unit

BIS/EEG1_WAV, EEG wave from channel 1, W/128, uV

In [22]:
trks[trks['tname'].str.contains('BIS/EEG1_WAV')].shape
Out[22]:
(5871, 2)

Cases of Interest¶

These are the subset of case ids for which modelling and analysis will be performed based upon inclusion criteria and waveform data availability.

In [23]:
# TRACK NAMES is used for metadata analysis via API
TRACK_NAMES = ['SNUADC/ART', 'SNUADC/ECG_II', 'BIS/EEG1_WAV']
TRACK_SRATES = [500, 500, 128]
# EXTRACTION TRACK NAMES adds the EVENT track which is only used when doing actual file i/o
EXTRACTION_TRACK_NAMES = ['SNUADC/ART', 'SNUADC/ECG_II', 'BIS/EEG1_WAV', 'EVENT']
EXTRACTION_TRACK_SRATES = [500, 500, 128, 1]
In [24]:
# As in the paper, select cases which meet the following criteria:
#
# For patients, the inclusion criteria were as follows:
# (1) adults (age >= 18)
# (2) administered general anaesthesia
# (3) undergone non-cardiac surgery. 
#
# For waveform data, the inclusion criteria were as follows:
# (1) no missing monitoring for ABP, ECG, and EEG waveforms
# (2) no cases containing false events or non-events due to poor signal quality
#     (checked in second stage of data preprocessing)

# Adult
inclusion_1 = cases.loc[cases['age'] >= 18].index
print(f'{len(cases)-len(inclusion_1)} cases excluded, {len(inclusion_1)} remaining due to age criteria')

# General Anesthesia
inclusion_2 = cases.loc[cases['ane_type'] == 'General'].index
print(f'{len(cases)-len(inclusion_2)} cases excluded, {len(inclusion_2)} remaining due to anesthesia criteria')

# Non-cardiac surgery
inclusion_3 = cases.loc[
    ~cases['opname'].str.contains("cardiac", case=False)
    & ~cases['opname'].str.contains("aneurysmal", case=False)
].index
print(f'{len(cases)-len(inclusion_3)} cases excluded, {len(inclusion_3)} remaining due to non-cardiac surgery criteria')

# ABP, ECG, EEG waveforms
inclusion_4 = trks.loc[trks['tname'].isin(TRACK_NAMES)].index.value_counts()
inclusion_4 = inclusion_4[inclusion_4 == len(TRACK_NAMES)].index
print(f'{len(cases)-len(inclusion_4)} cases excluded, {len(inclusion_4)} remaining due to missing waveform data')

# SQI filter
# NOTE: this depends on a sqi_filter.csv generated by external processing
inclusion_5 = pd.read_csv('sqi_filter.csv', header=None, names=['caseid','sqi']).set_index('caseid').index
print(f'{len(cases)-len(inclusion_5)} cases excluded, {len(inclusion_5)} remaining due to SQI threshold not being met')

# Only include cases with known good waveforms.
exclusion_6 = pd.read_csv('malformed_tracks_filter.csv', header=None, names=['caseid']).set_index('caseid').index
inclusion_6 = cases.index.difference(exclusion_6)
print(f'{len(cases)-len(inclusion_6)} cases excluded, {len(inclusion_6)} remaining due to malformed waveforms')

cases_of_interest_idx = inclusion_1 \
    .intersection(inclusion_2) \
    .intersection(inclusion_3) \
    .intersection(inclusion_4) \
    .intersection(inclusion_5) \
    .intersection(inclusion_6)

cases_of_interest = cases.loc[cases_of_interest_idx]

print()
print(f'{cases_of_interest_idx.shape[0]} out of {cases.shape[0]} total cases remaining after exclusions applied')

# Trim cases of interest to MAX_CASES
if MAX_CASES:
    cases_of_interest_idx = cases_of_interest_idx[:MAX_CASES]
print(f'{cases_of_interest_idx.shape[0]} cases of interest selected')
57 cases excluded, 6331 remaining due to age criteria
345 cases excluded, 6043 remaining due to anesthesia criteria
14 cases excluded, 6374 remaining due to non-cardiac surgery criteria
3019 cases excluded, 3369 remaining due to missing waveform data
0 cases excluded, 6388 remaining due to SQI threshold not being met
533 cases excluded, 5855 remaining due to malformed waveforms

2763 out of 6388 total cases remaining after exclusions applied
2763 cases of interest selected
In [25]:
cases_of_interest.head(n=5)
Out[25]:
subjectid casestart caseend anestart aneend opstart opend adm dis icu_days ... intraop_colloid intraop_ppf intraop_mdz intraop_ftn intraop_rocu intraop_vecu intraop_eph intraop_phe intraop_epi intraop_ca
caseid
1 5955 0 11542 -552 10848.0 1668 10368 -236220 627780 0 ... 0 120 0.0 100 70 0 10 0 0 0
4 1903 0 20990 -778 20222.0 2522 17822 -201120 576480 1 ... 0 80 0.0 100 100 0 50 0 0 0
7 5124 0 15770 477 14817.0 3177 14577 -154320 623280 3 ... 0 0 0.0 0 120 0 0 0 0 0
10 2175 0 20992 -1743 21057.0 2457 19857 -220740 3580860 1 ... 0 90 0.0 0 110 0 20 500 0 600
12 491 0 31203 -220 31460.0 5360 30860 -208500 1519500 4 ... 200 100 0.0 100 70 0 20 0 0 3300

5 rows × 73 columns

Tracks of Interest¶

These are the subset of tracks (waveforms) for the cases of interest identified above.

In [26]:
# A single case maps to one or more waveform tracks. Select only the tracks required for analysis.
trks_of_interest = trks.loc[cases_of_interest_idx][trks.loc[cases_of_interest_idx]['tname'].isin(TRACK_NAMES)]
trks_of_interest.shape
Out[26]:
(8289, 2)
In [27]:
trks_of_interest.head(n=5)
Out[27]:
tname tid
caseid
1 BIS/EEG1_WAV 0aa685df768489a18a5e9f53af0d83bf60890c73
1 SNUADC/ART 724cdd7184d7886b8f7de091c5b135bd01949959
1 SNUADC/ECG_II 8c9161aaae8cb578e2aa7b60f44234d98d2b3344
4 BIS/EEG1_WAV 1b4c2379be3397a79d3787dd810190150dc53f27
4 SNUADC/ART e28777c4706fe3a5e714bf2d91821d22d782d802
In [28]:
trks_of_interest_idx = trks_of_interest.set_index('tid').index
trks_of_interest_idx.shape
Out[28]:
(8289,)

Build Tracks Cache for Local Processing¶

Track data are large and therefore expensive to download each time they are used. By default, the .vital file format stores all tracks for each case internally. Since only select tracks per case are required, each .vital file can be reduced by discarding the unused tracks.

In [29]:
# Ensure the full vital file dataset is available for cases of interest.
count_downloaded = 0
count_present = 0

for idx in cases_of_interest_idx:
    full_path = f'{VITAL_ALL}/{idx:04d}.vital'
    if not os.path.isfile(full_path):
        print(f'Missing vital file: {full_path}')
        # Download and save the file.
        vf = vitaldb.VitalFile(idx)
        vf.to_vital(full_path)
        count_downloaded += 1
    else:
        count_present += 1

print()
print(f'Count of cases of interest:           {cases_of_interest_idx.shape[0]}')
print(f'Count of vital files downloaded:      {count_downloaded}')
print(f'Count of vital files already present: {count_present}')
Count of cases of interest:           2763
Count of vital files downloaded:      0
Count of vital files already present: 2763

Validate Mini Files¶

In [30]:
# Convert vital files to "mini" versions including only the subset of tracks defined in TRACK_NAMES above.
# Only perform conversion for the cases of interest.
# NOTE: If this cell is interrupted, it can be restarted and will continue where it left off.
count_minified = 0
count_present = 0
count_missing_tracks = 0
count_not_fixable = 0

# If set to true, local mini files are checked for all tracks even if the mini file is already present.
FORCE_VALIDATE = False

for idx in cases_of_interest_idx:
    full_path = f'{VITAL_ALL}/{idx:04d}.vital'
    mini_path = f'{VITAL_MINI}/{idx:04d}_mini.vital'

    if FORCE_VALIDATE or not os.path.isfile(mini_path):
        print(f'Creating mini vital file: {idx}')
        vf = vitaldb.VitalFile(full_path, EXTRACTION_TRACK_NAMES)
        
        if len(vf.get_track_names()) != 4:
            print(f'Missing track in vital file: {idx}, {set(EXTRACTION_TRACK_NAMES).difference(set(vf.get_track_names()))}')
            count_missing_tracks += 1
            
            # Attempt to download from VitalDB directly and see if missing tracks are present.
            vf = vitaldb.VitalFile(idx, EXTRACTION_TRACK_NAMES)
            
            if len(vf.get_track_names()) != 4:
                print(f'Unable to fix missing tracks: {idx}')
                count_not_fixable += 1
                continue
                
            # Check each required track for empty sample data.
            found_empty = False
            for name, srate in zip(EXTRACTION_TRACK_NAMES, EXTRACTION_TRACK_SRATES):
                if vf.get_track_samples(name, 1/srate).shape[0] == 0:
                    print(f'Empty track: {idx}, {name}')
                    count_not_fixable += 1
                    found_empty = True
                    break
            if found_empty:
                continue

        vf.to_vital(mini_path)
        count_minified += 1
    else:
        count_present += 1

print()
print(f'Count of cases of interest:           {cases_of_interest_idx.shape[0]}')
print(f'Count of vital files minified:        {count_minified}')
print(f'Count of vital files already present: {count_present}')
print(f'Count of vital files missing tracks:  {count_missing_tracks}')
print(f'Count of vital files not fixable:     {count_not_fixable}')
Count of cases of interest:           2763
Count of vital files minified:        0
Count of vital files already present: 2763
Count of vital files missing tracks:  0
Count of vital files not fixable:     0

Filtering¶

Preprocessing characteristics are different for each of the three signal categories:

  • ABP: no preprocessing, use as-is
  • ECG: apply a 1-40Hz bandpass filter, then perform Z-score normalization
  • EEG: apply a 0.5-50Hz bandpass filter

apply_bandpass_filter() implements the bandpass filter using scipy.signal

apply_zscore_normalization() implements the Z-score normalization using numpy

In [31]:
from scipy.signal import butter, lfilter, spectrogram

# define two methods for data preprocessing

def apply_bandpass_filter(data, lowcut, highcut, fs, order=5):
    b, a = butter(order, [lowcut, highcut], fs=fs, btype='band')
    y = lfilter(b, a, np.nan_to_num(data))
    return y

def apply_zscore_normalization(signal):
    mean = np.nanmean(signal)
    std = np.nanstd(signal)
    return (signal - mean) / std
In [32]:
# Filtering Demonstration

# Plot example before/after signals for a single case to illustrate the
# preprocessing applied to each track type.
caseidx = 1
file_path = f"{VITAL_MINI}/{caseidx:04d}_mini.vital"
vf = vitaldb.VitalFile(file_path, TRACK_NAMES)

originalAbp = None
filteredAbp = None
originalEcg = None
filteredEcg = None
originalEeg = None
filteredEeg = None

ABP_TRACK_NAME = "SNUADC/ART"
ECG_TRACK_NAME = "SNUADC/ECG_II"
EEG_TRACK_NAME = "BIS/EEG1_WAV"

for i, (track_name, rate) in enumerate(zip(TRACK_NAMES, TRACK_SRATES)):
    # Get samples for this track
    track_samples = vf.get_track_samples(track_name, 1/rate)
    #track_samples, _ = vf.get_samples(track_name, 1/rate)
    print(f"Track {track_name} @ {rate}Hz shape {len(track_samples)}")

    if track_name == ABP_TRACK_NAME:
        # ABP waveforms are used without further pre-processing
        originalAbp = track_samples
        filteredAbp = track_samples
    elif track_name == ECG_TRACK_NAME:
        originalEcg = track_samples
        # ECG waveforms are band-pass filtered between 1 and 40 Hz, and Z-score normalized
        # first apply bandpass filter
        filteredEcg = apply_bandpass_filter(track_samples, 1, 40, rate)
        # then do z-score normalization
        filteredEcg = apply_zscore_normalization(filteredEcg)
    elif track_name == EEG_TRACK_NAME:
        # EEG waveforms are band-pass filtered between 0.5 and 50 Hz
        originalEeg = track_samples
        filteredEeg = apply_bandpass_filter(track_samples, 0.5, 50, rate, 2)

def plotSignal(data, title):
    plt.figure(figsize=(20, 5))
    plt.plot(data)
    plt.title(title)
    plt.show()

plotSignal(originalAbp, "Original ABP")
plotSignal(filteredAbp, "Filtered ABP (identical: ABP is used as-is)")
plotSignal(originalEcg, "Original ECG")
plotSignal(filteredEcg, "Filtered ECG")
plotSignal(originalEeg, "Original EEG")
plotSignal(filteredEeg, "Filtered EEG")
Track SNUADC/ART @ 500Hz shape 5771049
Track SNUADC/ECG_II @ 500Hz shape 5771049
Track BIS/EEG1_WAV @ 128Hz shape 1477389
In [33]:
# Preprocess data tracks
ABP_TRACK_NAME = "SNUADC/ART"
ECG_TRACK_NAME = "SNUADC/ECG_II"
EEG_TRACK_NAME = "BIS/EEG1_WAV"
EVENT_TRACK_NAME = "EVENT"
MINI_FILE_FOLDER = VITAL_MINI
CACHE_FILE_FOLDER = VITAL_PREPROCESS_SCRATCH

if RESET_CACHE:
    TRACK_CACHE = None
    SEGMENT_CACHE = None

if TRACK_CACHE is None:
    TRACK_CACHE = {}
    SEGMENT_CACHE = {}

def get_track_data(case, print_when_file_loaded = False):
    parsedFile = None
    abp = None
    eeg = None
    ecg = None
    events = None

    for i, (track_name, rate) in enumerate(zip(EXTRACTION_TRACK_NAMES, EXTRACTION_TRACK_SRATES)):
        # use integer case id and track name, delimited by pipe, as cache key
        cache_label = f"{case}|{track_name}"
        
        if cache_label not in TRACK_CACHE:
            if parsedFile is None:
                file_path = f"{MINI_FILE_FOLDER}/{case:04d}_mini.vital"
                if print_when_file_loaded:
                    print(f"[{datetime.now()}] Loading vital file {file_path}")
                parsedFile = vitaldb.VitalFile(file_path, EXTRACTION_TRACK_NAMES)
            
            dataset = np.array(parsedFile.get_track_samples(track_name, 1/rate))
            
            if track_name == ABP_TRACK_NAME:
                # no filtering for ABP
                abp = dataset
                abp = pd.DataFrame(abp).ffill(axis=0).bfill(axis=0)[0].values
                if USE_MEMORY_CACHING:
                    TRACK_CACHE[cache_label] = abp
            elif track_name == ECG_TRACK_NAME:
                ecg = dataset
                # apply ECG filtering: first bandpass then do z-score normalization
                ecg = pd.DataFrame(ecg).ffill(axis=0).bfill(axis=0)[0].values
                ecg = apply_bandpass_filter(ecg, 1, 40, rate, 2)
                ecg = apply_zscore_normalization(ecg)
                
                if USE_MEMORY_CACHING:
                    TRACK_CACHE[cache_label] = ecg
            elif track_name == EEG_TRACK_NAME:
                eeg = dataset
                eeg = pd.DataFrame(eeg).ffill(axis=0).bfill(axis=0)[0].values
                # apply EEG filtering: bandpass only
                eeg = apply_bandpass_filter(eeg, 0.5, 50, rate, 2)
                if USE_MEMORY_CACHING:
                    TRACK_CACHE[cache_label] = eeg
            elif track_name == EVENT_TRACK_NAME:
                events = dataset
                if USE_MEMORY_CACHING:
                    TRACK_CACHE[cache_label] = events
        else:
            # cache hit, pull from cache
            if track_name == ABP_TRACK_NAME:
                abp = TRACK_CACHE[cache_label]
            elif track_name == ECG_TRACK_NAME:
                ecg = TRACK_CACHE[cache_label]
            elif track_name == EEG_TRACK_NAME:
                eeg = TRACK_CACHE[cache_label]
            elif track_name == EVENT_TRACK_NAME:
                events = TRACK_CACHE[cache_label]

    return (abp, ecg, eeg, events)

# ABP waveforms are used without further pre-processing
# ECG waveforms are band-pass filtered between 1 and 40 Hz, and Z-score normalized
# EEG waveforms are band-pass filtered between 0.5 and 50 Hz
if PRELOADING_CASES:
    # determine disk cache file label
    maxlabel = "ALL"
    if MAX_CASES is not None:
        maxlabel = str(MAX_CASES)
    picklefile = f"{CACHE_FILE_FOLDER}/{PREDICTION_WINDOW}_minutes_MAX{maxlabel}.trackcache"

    for track in tqdm(cases_of_interest_idx):
        # getting track data will cause a cache-check and fill when missing
        # will also apply appropriate filtering per track
        get_track_data(track, False)
    
    print(f"Generated track cache, {len(TRACK_CACHE)} records generated")


def get_segment_data(file_path):
    abp = None
    eeg = None
    ecg = None

    if USE_MEMORY_CACHING:
        if file_path in SEGMENT_CACHE:
            (abp, ecg, eeg) = SEGMENT_CACHE[file_path]
            return (abp, ecg, eeg)

    try:
        with h5py.File(file_path, 'r') as f:
            abp = np.array(f['abp'])
            ecg = np.array(f['ecg'])
            eeg = np.array(f['eeg'])
        
        # coerce each signal to a fixed one-minute length:
        # truncate long segments; np.resize tiles short ones up to length n
        def coerce_length(x, n):
            return x[:n] if len(x) >= n else np.resize(x, n)

        abp = coerce_length(abp, 30000)  # 60 s at 500 Hz
        ecg = coerce_length(ecg, 30000)  # 60 s at 500 Hz
        eeg = coerce_length(eeg, 7680)   # 60 s at 128 Hz

        if USE_MEMORY_CACHING:
            SEGMENT_CACHE[file_path] = (abp, ecg, eeg)
    except Exception:
        abp = None
        ecg = None
        eeg = None

    return (abp, ecg, eeg)
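The fixed lengths enforced in `get_segment_data` correspond to one-minute windows at each signal's sampling rate; the check below spells out that arithmetic. Note that `np.resize` tiles the array cyclically when growing it, so a short segment is padded by repeating its own samples rather than with zeros.

```python
import numpy as np

# one-minute segments at the sampling rates used in this notebook
ABP_ECG_SRATE_HZ = 500
EEG_SRATE_HZ = 128
SEGMENT_SECONDS = 60

print(SEGMENT_SECONDS * ABP_ECG_SRATE_HZ)  # 30000 ABP/ECG samples
print(SEGMENT_SECONDS * EEG_SRATE_HZ)      # 7680 EEG samples

# np.resize pads a short array by repeating it cyclically
print(np.resize(np.arange(5), 8))  # [0 1 2 3 4 0 1 2]
```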

The following method is adapted from the preprocessing block of reference [6] (https://github.com/vitaldb/examples/blob/master/hypotension_art.ipynb)

The approach first locates an intraoperative hypotensive event in the ABP waveform, then backtracks to an earlier point in the waveform and extracts a 60-second segment to use as the model's input feature. The figure below illustrates this approach and is reproduced from the VitalDB example notebook referenced above.

Feature segment extraction
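The backtracking reduces to simple index arithmetic: the feature segment ends a prediction window before the event onset and lasts 60 seconds. The helper below (`predictive_segment_bounds`, a hypothetical name introduced here for illustration) sketches that computation in seconds.

```python
# Hypothetical helper illustrating the index arithmetic: for an IOH event
# starting at ioh_event_start_s (in seconds) and a prediction window in
# minutes, the 60-second feature segment ends pred_window_min minutes
# before the event and starts 60 seconds before that.
def predictive_segment_bounds(ioh_event_start_s, pred_window_min):
    segment_end_s = ioh_event_start_s - pred_window_min * 60
    segment_start_s = segment_end_s - 60
    return segment_start_s, segment_end_s

# an event at t = 1800 s with a 5-minute prediction window
print(predictive_segment_bounds(1800, 5))  # (1440, 1500)
```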

In [34]:
def getSurgeryBoundariesInSeconds(event, debug=False):
    # non-NaN entries are annotation strings (NaN != NaN filters unmarked samples)
    eventIndices = np.argwhere(event==event)
    # we are looking for the last index whose annotation contains 'started'
    lastStart = 0
    firstFinish = len(event)-1
    
    # find last start
    for idx in eventIndices:
        if 'started' in event[idx[0]]:
            if debug:
                print(event[idx[0]])
                print(idx[0])
            lastStart = idx[0]
    
    # find first finish
    for idx in eventIndices:
        if 'finish' in event[idx[0]]:
            if debug:
                print(event[idx[0]])
                print(idx[0])

            firstFinish = idx[0]
            break
    
    if debug:
        print(f'lastStart, firstFinish: {lastStart}, {firstFinish}')
    return (lastStart, firstFinish)
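A minimal sketch of the same boundary logic as `getSurgeryBoundariesInSeconds`, run on a made-up EVENT track: non-NaN entries are annotation strings, and surgery spans from the last entry containing 'started' to the first entry containing 'finish'.

```python
import numpy as np

# synthetic EVENT track: NaN where no annotation exists (illustrative only)
event = np.array([np.nan, 'Anesthesia started', np.nan, 'Operation started',
                  np.nan, np.nan, 'Operation finished', np.nan], dtype=object)

marked = [i for i, e in enumerate(event) if e == e]  # NaN != NaN drops gaps
last_start = max((i for i in marked if 'started' in event[i]), default=0)
first_finish = next((i for i in marked if 'finish' in event[i]),
                    len(event) - 1)

print(last_start, first_finish)  # 3 6
```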
In [35]:
def areCaseSegmentsCached(caseid):
    seg_folder = f"{VITAL_EXTRACTED_SEGMENTS}/{caseid:04d}"
    return os.path.exists(seg_folder) and len(os.listdir(seg_folder)) > 0
In [36]:
def isAbpSegmentValidNumpy(samples, debug=False):
    valid = True
    if np.isnan(samples).mean() > 0.1:
        valid = False
        if debug:
            print(f">10% NaN")
    elif (samples > 200).any():
        valid = False
        if debug:
            print(f"Presence of BP > 200")
    elif (samples < 30).any():
        valid = False
        if debug:
            print(f"Presence of BP < 30")
    elif np.max(samples) - np.min(samples) < 30:
        if debug:
            print(f"Max - Min test < 30")
        valid = False
    elif (np.abs(np.diff(samples)) > 30).any():  # abrupt change -> noise
        if debug:
            print(f"Abrupt change (noise)")
        valid = False
    
    return valid
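A small demonstration of the validity rules above on synthetic one-minute ABP segments (values in mmHg, sampled at 500 Hz); the waveforms are invented for illustration. A plausibly pulsatile trace passes, while a flat trace fails the pulse-pressure (max minus min) check.

```python
import numpy as np

fs = 500
t = np.arange(0, 60, 1 / fs)

def is_abp_segment_valid(samples):
    # mirrors the rules in isAbpSegmentValidNumpy above
    return not (np.isnan(samples).mean() > 0.1            # too many missing samples
                or (samples > 200).any()                  # implausibly high BP
                or (samples < 30).any()                   # implausibly low BP
                or np.max(samples) - np.min(samples) < 30 # flat / damped trace
                or (np.abs(np.diff(samples)) > 30).any()) # abrupt jump = noise

pulsatile = 90 + 25 * np.sin(2 * np.pi * 1.2 * t)  # synthetic arterial waveform
flatline = np.full_like(t, 80.0)                   # no pulse pressure

print(is_abp_segment_valid(pulsatile))  # True
print(is_abp_segment_valid(flatline))   # False
```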
In [37]:
def isAbpSegmentValid(vf, debug=False):
    ABP_ECG_SRATE_HZ = 500
    ABP_TRACK_NAME = "SNUADC/ART"

    samples = np.array(vf.get_track_samples(ABP_TRACK_NAME, 1/ABP_ECG_SRATE_HZ))
    return isAbpSegmentValidNumpy(samples, debug)
In [38]:
def saveCaseSegments(caseid, positiveSegments, negativeSegments, compresslevel=9, debug=False, forceWrite=False):
    if len(positiveSegments) == 0 and len(negativeSegments) == 0:
        # exit early if no events found
        print(f'{caseid}: exit early, no segments to save')
        return

    # event composition
    # predictiveSegmentStart in seconds, predictiveSegmentEnd in seconds, predWindow (0 for negative), abp, ecg, eeg)
    # 0start, 1end, 2predwindow, 3abp, 4ecg, 5eeg

    seg_folder = f"{VITAL_EXTRACTED_SEGMENTS}/{caseid:04d}"
    if not os.path.exists(seg_folder):
        # if directory needs to be created, then there are no cached segments
        os.mkdir(seg_folder)
    else:
        if not forceWrite:
            # exit early if folder already exists, case already produced
            return

    # prior to writing files out, clear existing files
    for filename in os.listdir(seg_folder):
        file_path = os.path.join(seg_folder, filename)
        if debug:
            print(f'deleting: {file_path}')
        try:
            if os.path.isfile(file_path):
                os.unlink(file_path)
        except Exception as e:
            print('Failed to delete %s. Reason: %s' % (file_path, e))
    
    count_pos_saved = 0
    for i in range(0, len(positiveSegments)):
        event = positiveSegments[i]
        startIndex = event[0]
        endIndex = event[1]
        predWindow = event[2]
        abp = event[3]
        #ecg = event[4]
        #eeg = event[5]

        seg_filename = f"{caseid:04d}_{startIndex}_{predWindow:02d}_True.h5"
        seg_fullpath = f"{seg_folder}/{seg_filename}"
        if isAbpSegmentValidNumpy(abp, debug):
            count_pos_saved += 1

            abp = abp.tolist()
            ecg = event[4].tolist()
            eeg = event[5].tolist()
        
            f = h5py.File(seg_fullpath, "w")
            f.create_dataset('abp', data=abp, compression="gzip", compression_opts=compresslevel)
            f.create_dataset('ecg', data=ecg, compression="gzip", compression_opts=compresslevel)
            f.create_dataset('eeg', data=eeg, compression="gzip", compression_opts=compresslevel)
            
            f.flush()
            f.close()
            f = None

            abp = None
            ecg = None
            eeg = None

            # f.create_dataset('label', data=[1], compression="gzip", compression_opts=compresslevel)
            # f.create_dataset('pred_window', data=[event[2]], compression="gzip", compression_opts=compresslevel)
            # f.create_dataset('caseid', data=[caseid], compression="gzip", compression_opts=compresslevel)
        elif debug:
            print(f"{caseid:04d} {predWindow:02d}min {startIndex} starttime = ignored, segment validity issues")

    count_neg_saved = 0
    for i in range(0, len(negativeSegments)):
        event = negativeSegments[i]
        startIndex = event[0]
        endIndex = event[1]
        predWindow = event[2]
        abp = event[3]
        #ecg = event[4]
        #eeg = event[5]

        seg_filename = f"{caseid:04d}_{startIndex}_0_False.h5"
        seg_fullpath = f"{seg_folder}/{seg_filename}"
        if isAbpSegmentValidNumpy(abp, debug):
            count_neg_saved += 1

            abp = abp.tolist()
            ecg = event[4].tolist()
            eeg = event[5].tolist()
            
            f = h5py.File(seg_fullpath, "w")
            f.create_dataset('abp', data=abp, compression="gzip", compression_opts=compresslevel)
            f.create_dataset('ecg', data=ecg, compression="gzip", compression_opts=compresslevel)
            f.create_dataset('eeg', data=eeg, compression="gzip", compression_opts=compresslevel)
            
            f.flush()
            f.close()
            f = None

            abp = None
            ecg = None
            eeg = None

            # f.create_dataset('label', data=[0], compression="gzip", compression_opts=compresslevel)
            # f.create_dataset('pred_window', data=[0], compression="gzip", compression_opts=compresslevel)
            # f.create_dataset('caseid', data=[caseid], compression="gzip", compression_opts=compresslevel)
        elif debug:
            print(f"{caseid:04d} CleanWindow {startIndex} starttime = ignored, segment validity issues")
            
    if count_neg_saved == 0 and count_pos_saved == 0:
        print(f'{caseid}: nothing saved, all segments filtered')
In [39]:
# Generate hypotensive events
# Hypotensive events are defined as a 1-minute interval with sustained ABP of less than 65 mmHg
# Note: Hypotensive events should be at least 20 minutes apart to minimize potential residual effects from previous events
# Generate hypotension non-events
# To sample non-events, 30-minute segments where the ABP was above 75 mmHg were selected, and then
# three one-minute samples of each waveform were obtained from the middle of the segment
# both occur in extract_segments
def extract_segments(
    cases_of_interest_idx,
    debug=False,
    checkCache=True,
    forceWrite=False,
    returnSegments=False,
    skipInvalidCleanEvents=False,
    skipInvalidIohEvents=False
):
    # Sampling rate for ABP and ECG, Hz. These rates should be the same. Default = 500
    ABP_ECG_SRATE_HZ = 500

    # Sampling rate for EEG. Default = 128
    EEG_SRATE_HZ = 128

    # Final dataset for training and testing the model.
    positiveSegmentsMap = {}
    negativeSegmentsMap = {}
    iohEventsMap = {}
    cleanEventsMap = {}

    # Process each case and extract segments. For each segment identify presence of an event in the label zone.
    count_cases = len(cases_of_interest_idx)

    #for case_count, caseid in tqdm(enumerate(cases_of_interest_idx), total=count_cases):
    for case_count, caseid in enumerate(cases_of_interest_idx):
        if debug:
            print(f'Loading case: {caseid:04d}, ({case_count + 1} of {count_cases})')

        if checkCache and areCaseSegmentsCached(caseid):
            if debug:
                print(f'Skipping case: {caseid:04d}, already cached')
            # skip records we've already cached
            continue

        # read the arterial waveform
        (abp, ecg, eeg, event) = get_track_data(caseid)
        if debug:
            print(f'Length of {TRACK_NAMES[0]}:       {abp.shape[0]}')
            print(f'Length of {TRACK_NAMES[1]}:    {ecg.shape[0]}')
            print(f'Length of {TRACK_NAMES[2]}:     {eeg.shape[0]}')

        (startInSeconds, endInSeconds) = getSurgeryBoundariesInSeconds(event)
        if debug:
            print(f"Event markers indicate that surgery begins at {startInSeconds}s and ends at {endInSeconds}s.")

        #track_length_seconds = int(len(abp) / ABP_ECG_SRATE_HZ)
        track_length_seconds = endInSeconds
        
        if debug:
            print(f"Processing case {caseid} with length {track_length_seconds}s")

        
        # check if the ABP segment in the surgery window is valid
        if debug:
            isSurgerySegmentValid = \
                isAbpSegmentValidNumpy(abp[startInSeconds * ABP_ECG_SRATE_HZ:endInSeconds * ABP_ECG_SRATE_HZ])
            print(f'{caseid}: surgery segment valid: {isSurgerySegmentValid}')
        
        iohEvents = []
        cleanEvents = []
        i = 0
        started = False
        eofReached = False
        trackStartIndex = None

        # set i pointer (which operates in seconds) to start marker for surgery
        i = startInSeconds

        # FIRST PASS
        # in the first forward pass, we are going to identify the start/end boundaries of all IOH events within the case
        ioh_events_valid = []
        
        while i < track_length_seconds - 60 and i < endInSeconds:
            segmentStart = None
            segmentEnd = None
            segFound = False

            # look forward one minute
            abpSeg = abp[i * ABP_ECG_SRATE_HZ:(i + 60) * ABP_ECG_SRATE_HZ]

            # roll forward until we hit a one minute window where mean ABP >= 65 so we know leads are connected and it's tracking
            if not started:
                if np.nanmean(abpSeg) >= 65:
                    started = True
                    trackStartIndex = i
            # if we're started and mean abp for the window is <65, we are starting a new IOH event
            elif np.nanmean(abpSeg) < 65:
                segmentStart = i
                # now seek forward to find the end of the event, repeatedly checking the last minute of the IOH event
                for j in range(i + 60, track_length_seconds):
                    # look backward one minute
                    abpSegForward = abp[(j - 60) * ABP_ECG_SRATE_HZ:j * ABP_ECG_SRATE_HZ]
                    if np.nanmean(abpSegForward) >= 65:
                        segmentEnd = j - 1
                        break
                if segmentEnd is None:
                    eofReached = True
                else:
                    # otherwise, end of the IOH segment has been reached, record it
                    iohEvents.append((segmentStart, segmentEnd))
                    segFound = True
                    
                    if skipInvalidIohEvents:
                        isIohSegmentValid = isAbpSegmentValidNumpy(abpSeg)
                        ioh_events_valid.append(isIohSegmentValid)
                        if debug:
                            print(f'{caseid}: ioh segment valid: {isIohSegmentValid}, {segmentStart}, {segmentEnd}, {abpSeg.shape}')
                    else:
                        ioh_events_valid.append(True)

            i += 1
            if not started:
                continue
            elif eofReached:
                break
            elif segFound:
                i = segmentEnd + 1

        # SECOND PASS
        # in the second forward pass, we are going to identify the start/end boundaries of all non-overlapping 30 minute "clean" windows
        # reuse the 'start of signal' index from our first pass
        if trackStartIndex is None:
            trackStartIndex = startInSeconds
        i = trackStartIndex
        eofReached = False

        clean_events_valid = []
        
        while i < track_length_seconds - 1800 and i < endInSeconds:
            segmentStart = None
            segmentEnd = None
            segFound = False

            startIndex = i
            endIndex = i + 1800

            # check to see if this 30 minute window overlaps any IOH events, if so ffwd to end of latest overlapping IOH
            overlapFound = False
            latestEnd = None
            for event in iohEvents:
                # case 1: starts during an event
                if startIndex >= event[0] and startIndex < event[1]:
                    latestEnd = event[1]
                    overlapFound = True
                # case 2: ends during an event
                elif endIndex >= event[0] and endIndex < event[1]:
                    latestEnd = event[1]
                    overlapFound = True
                # case 3: event occurs entirely inside of the window
                elif startIndex < event[0] and endIndex > event[1]:
                    latestEnd = event[1]
                    overlapFound = True

            # FFWD if we found an overlap
            if overlapFound:
                i = latestEnd + 1
                continue

            # look forward 30 minutes
            abpSeg = abp[startIndex * ABP_ECG_SRATE_HZ:endIndex * ABP_ECG_SRATE_HZ]

            # if we're started and mean abp for the window is >= 75, we are starting a new clean event
            if np.nanmean(abpSeg) >= 75:
                overlapFound = False
                latestEnd = None
                for event in iohEvents:
                    # case 1: starts during an event
                    if startIndex >= event[0] and startIndex < event[1]:
                        latestEnd = event[1]
                        overlapFound = True
                    # case 2: ends during an event
                    elif endIndex >= event[0] and endIndex < event[1]:
                        latestEnd = event[1]
                        overlapFound = True
                    # case 3: event occurs entirely inside of the window
                    elif startIndex < event[0] and endIndex > event[1]:
                        latestEnd = event[1]
                        overlapFound = True

                if not overlapFound:
                    segFound = True
                    segmentEnd = endIndex
                    cleanEvents.append((startIndex, endIndex))
                    
                    if skipInvalidCleanEvents:
                        isCleanSegmentValid = isAbpSegmentValidNumpy(abpSeg)
                        clean_events_valid.append(isCleanSegmentValid)
                        if debug:
                            print(f'{caseid}: clean segment valid: {isCleanSegmentValid}, {startIndex}, {endIndex}, {abpSeg.shape}')
                    else:
                        clean_events_valid.append(True)

            i += 10
            if segFound:
                i = segmentEnd + 1

        if debug:
            print(f"IOH Events for case {caseid}: {iohEvents}")
            print(f"Clean Events for case {caseid}: {cleanEvents}")

        positiveSegments = []
        negativeSegments = []

        # THIRD PASS
        # in the third pass, we will use the collections of ioh event windows to generate our actual extracted segments based on our prediction window (positive labels)
        for i in range(0, len(iohEvents)):
            # Don't extract segments from invalid IOH event windows.
            if not ioh_events_valid[i]:
                continue

            if debug:
                print(f"Checking event {iohEvents[i]}")
            # we want to review current event boundaries, as well as previous event boundaries if available
            event = iohEvents[i]
            previousEvent = None
            if i > 0:
                previousEvent = iohEvents[i - 1]

            for predWindow in ALL_PREDICTION_WINDOWS:
                if debug:
                    print(f"Checking event {iohEvents[i]} for pred {predWindow}")
                iohEventStart = event[0]
                predictiveSegmentEnd = event[0] - (predWindow*60)
                predictiveSegmentStart = predictiveSegmentEnd - 60

                if (predictiveSegmentStart < 0):
                    # don't rewind before the beginning of the track
                    if debug:
                        print(f"Checking event {iohEvents[i]} for pred {predWindow} - exit, before beginning")
                    continue
                elif (predictiveSegmentStart < trackStartIndex):
                    # don't rewind before the beginning of signal in track
                    if debug:
                        print(f"Checking event {iohEvents[i]} for pred {predWindow} - exit, before track start")
                    continue
                elif previousEvent is not None:
                    # does this event window come before or during the previous event?
                    overlapFound = False
                    # case 1: starts during an event
                    if predictiveSegmentStart >= previousEvent[0] and predictiveSegmentStart < previousEvent[1]:
                        overlapFound = True
                    # case 2: ends during an event
                    elif iohEventStart >= previousEvent[0] and iohEventStart < previousEvent[1]:
                        overlapFound = True
                    # case 3: event occurs entirely inside of the window
                    elif predictiveSegmentStart < previousEvent[0] and iohEventStart > previousEvent[1]:
                        overlapFound = True
                    # do not extract a segment if it overlaps with another IOH event
                    if overlapFound:
                        if debug:
                            print(f"Checking event {iohEvents[i]} for pred {predWindow} - exit, overlap with earlier segment")
                        continue

                # track the positive segment
                positiveSegments.append((predictiveSegmentStart, predictiveSegmentEnd, predWindow,
                    abp[predictiveSegmentStart*ABP_ECG_SRATE_HZ:predictiveSegmentEnd*ABP_ECG_SRATE_HZ],
                    ecg[predictiveSegmentStart*ABP_ECG_SRATE_HZ:predictiveSegmentEnd*ABP_ECG_SRATE_HZ],
                    eeg[predictiveSegmentStart*EEG_SRATE_HZ:predictiveSegmentEnd*EEG_SRATE_HZ]))

        # FOURTH PASS
        # in the fourth and final pass, we will use the collections of clean event windows to generate our actual extracted segments based (negative labels)
        for i in range(0, len(cleanEvents)):
            # Don't extract segments from invalid clean event windows.
            if not clean_events_valid[i]:
                continue
            
            # everything will be 30 minutes long at least
            event = cleanEvents[i]
            # choose sample 1 @ 10 minutes
            # choose sample 2 @ 15 minutes
            # choose sample 3 @ 20 minutes
            timeAtTen = event[0] + 600
            timeAtFifteen = event[0] + 900
            timeAtTwenty = event[0] + 1200

            negativeSegments.append((timeAtTen, timeAtTen + 60, 0,
                                   abp[timeAtTen*ABP_ECG_SRATE_HZ:(timeAtTen + 60)*ABP_ECG_SRATE_HZ],
                                   ecg[timeAtTen*ABP_ECG_SRATE_HZ:(timeAtTen + 60)*ABP_ECG_SRATE_HZ],
                                   eeg[timeAtTen*EEG_SRATE_HZ:(timeAtTen + 60)*EEG_SRATE_HZ]))
            negativeSegments.append((timeAtFifteen, timeAtFifteen + 60, 0,
                                   abp[timeAtFifteen*ABP_ECG_SRATE_HZ:(timeAtFifteen + 60)*ABP_ECG_SRATE_HZ],
                                   ecg[timeAtFifteen*ABP_ECG_SRATE_HZ:(timeAtFifteen + 60)*ABP_ECG_SRATE_HZ],
                                   eeg[timeAtFifteen*EEG_SRATE_HZ:(timeAtFifteen + 60)*EEG_SRATE_HZ]))
            negativeSegments.append((timeAtTwenty, timeAtTwenty + 60, 0,
                                   abp[timeAtTwenty*ABP_ECG_SRATE_HZ:(timeAtTwenty + 60)*ABP_ECG_SRATE_HZ],
                                   ecg[timeAtTwenty*ABP_ECG_SRATE_HZ:(timeAtTwenty + 60)*ABP_ECG_SRATE_HZ],
                                   eeg[timeAtTwenty*EEG_SRATE_HZ:(timeAtTwenty + 60)*EEG_SRATE_HZ]))

        if returnSegments:
            positiveSegmentsMap[caseid] = positiveSegments
            negativeSegmentsMap[caseid] = negativeSegments
            iohEventsMap[caseid] = iohEvents
            cleanEventsMap[caseid] = cleanEvents
        
        saveCaseSegments(caseid, positiveSegments, negativeSegments, 9, debug=debug, forceWrite=forceWrite)

        #if debug:
        print(f'{caseid}: positiveSegments: {len(positiveSegments)}, negativeSegments: {len(negativeSegments)}')

    return positiveSegmentsMap, negativeSegmentsMap, iohEventsMap, cleanEventsMap
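The negative-sampling rule in the fourth pass (three one-minute samples from the middle of each 30-minute clean window, at 10, 15, and 20 minutes in) also reduces to offset arithmetic. The helper below (`negative_sample_bounds`, a hypothetical name introduced here) sketches it in seconds.

```python
# Hypothetical sketch of the negative-sample offsets: each 30-minute clean
# window contributes three one-minute samples starting 10, 15, and 20
# minutes into the window.
def negative_sample_bounds(window_start_s):
    return [(window_start_s + m * 60, window_start_s + m * 60 + 60)
            for m in (10, 15, 20)]

print(negative_sample_bounds(0))     # [(600, 660), (900, 960), (1200, 1260)]
print(negative_sample_bounds(3600))  # offsets shift with the window start
```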

Case Extraction - Generate Segments Needed For Training¶

Ensure that all needed segments are in place for the cases being used. If the segments are already cached on disk, this step returns immediately.

In [40]:
print('Time to extract segments!')
Time to extract segments!
In [41]:
MANUAL_EXTRACT=True
SKIP_INVALID_CLEAN_EVENTS=True
SKIP_INVALID_IOH_EVENTS=True

if MANUAL_EXTRACT:
    mycoi = cases_of_interest_idx
    #mycoi = cases_of_interest_idx[:2800]
    #mycoi = [1]

    cnt = 0
    mod = 0
    for ci in mycoi:
        cnt += 1
        if mod % 100 == 0:
            print(f'count processed: {mod}, current case index: {ci}')
        try:
            p, n, i, c = extract_segments([ci], debug=False, checkCache=True, 
                                          forceWrite=True, returnSegments=False, 
                                          skipInvalidCleanEvents=SKIP_INVALID_CLEAN_EVENTS,
                                          skipInvalidIohEvents=SKIP_INVALID_IOH_EVENTS)
            p = None
            n = None
            i = None
            c = None
        except Exception as e:
            print(f'error on extract segment: {ci}: {e}')
        mod += 1
    print(f'extracted: {cnt}')
count processed: 0, current case index: 1
1: positiveSegments: 8, negativeSegments: 3
4: positiveSegments: 22, negativeSegments: 3
7: positiveSegments: 8, negativeSegments: 6
10: positiveSegments: 20, negativeSegments: 6
12: positiveSegments: 22, negativeSegments: 0
13: positiveSegments: 8, negativeSegments: 0
16: positiveSegments: 8, negativeSegments: 6
19: positiveSegments: 33, negativeSegments: 3
20: positiveSegments: 8, negativeSegments: 6
22: positiveSegments: 8, negativeSegments: 12
24: positiveSegments: 0, negativeSegments: 3
25: positiveSegments: 1, negativeSegments: 12
27: positiveSegments: 8, negativeSegments: 12
29: positiveSegments: 4, negativeSegments: 12
31: positiveSegments: 0, negativeSegments: 3
34: positiveSegments: 0, negativeSegments: 9
38: positiveSegments: 6, negativeSegments: 0
43: positiveSegments: 14, negativeSegments: 3
44: positiveSegments: 0, negativeSegments: 6
46: positiveSegments: 0, negativeSegments: 6
50: positiveSegments: 10, negativeSegments: 3
52: positiveSegments: 12, negativeSegments: 3
53: positiveSegments: 0, negativeSegments: 9
55: positiveSegments: 15, negativeSegments: 3
58: positiveSegments: 12, negativeSegments: 0
59: positiveSegments: 2, negativeSegments: 0
60: positiveSegments: 4, negativeSegments: 3
61: positiveSegments: 8, negativeSegments: 0
64: positiveSegments: 7, negativeSegments: 9
65: positiveSegments: 0, negativeSegments: 3
66: positiveSegments: 8, negativeSegments: 6
67: positiveSegments: 4, negativeSegments: 0
68: positiveSegments: 0, negativeSegments: 3
69: positiveSegments: 0, negativeSegments: 3
70: positiveSegments: 0, negativeSegments: 6
74: positiveSegments: 0, negativeSegments: 6
75: positiveSegments: 38, negativeSegments: 6
77: positiveSegments: 0, negativeSegments: 9
79: positiveSegments: 12, negativeSegments: 12
83: positiveSegments: 12, negativeSegments: 0
84: positiveSegments: 4, negativeSegments: 15
87: positiveSegments: 9, negativeSegments: 0
89: positiveSegments: 0, negativeSegments: 21
92: positiveSegments: 2, negativeSegments: 0
93: positiveSegments: 3, negativeSegments: 0
94: positiveSegments: 18, negativeSegments: 6
96: positiveSegments: 26, negativeSegments: 15
97: positiveSegments: 8, negativeSegments: 0
98: positiveSegments: 3, negativeSegments: 3
101: positiveSegments: 0, negativeSegments: 6
104: positiveSegments: 7, negativeSegments: 0
105: positiveSegments: 18, negativeSegments: 0
108: positiveSegments: 5, negativeSegments: 0
110: positiveSegments: 8, negativeSegments: 0
111: positiveSegments: 7, negativeSegments: 0
112: positiveSegments: 11, negativeSegments: 0
114: positiveSegments: 8, negativeSegments: 9
116: positiveSegments: 12, negativeSegments: 0
117: positiveSegments: 9, negativeSegments: 3
118: positiveSegments: 44, negativeSegments: 0
119: positiveSegments: 0, negativeSegments: 12
124: positiveSegments: 0, negativeSegments: 3
125: positiveSegments: 4, negativeSegments: 6
128: positiveSegments: 0, negativeSegments: 3
130: positiveSegments: 0, negativeSegments: 9
132: positiveSegments: 0, negativeSegments: 3
135: positiveSegments: 13, negativeSegments: 0
136: positiveSegments: 0, negativeSegments: 9
137: positiveSegments: 3, negativeSegments: 0
138: positiveSegments: 0, negativeSegments: 9
140: positiveSegments: 0, negativeSegments: 12
142: positiveSegments: 0, negativeSegments: 12
143: positiveSegments: 8, negativeSegments: 3
145: positiveSegments: 0, negativeSegments: 6
146: positiveSegments: 8, negativeSegments: 0
148: positiveSegments: 13, negativeSegments: 0
149: positiveSegments: 4, negativeSegments: 0
153: positiveSegments: 4, negativeSegments: 0
156: positiveSegments: 9, negativeSegments: 9
161: positiveSegments: 8, negativeSegments: 3
163: positiveSegments: 3, negativeSegments: 0
166: positiveSegments: 14, negativeSegments: 3
167: positiveSegments: 0, negativeSegments: 3
175: positiveSegments: 0, negativeSegments: 3
177: positiveSegments: 8, negativeSegments: 6
178: positiveSegments: 0, negativeSegments: 9
181: positiveSegments: 4, negativeSegments: 3
183: positiveSegments: 4, negativeSegments: 0
184: positiveSegments: 34, negativeSegments: 12
186: positiveSegments: 0, negativeSegments: 3
190: positiveSegments: 4, negativeSegments: 3
191: positiveSegments: 12, negativeSegments: 0
195: positiveSegments: 13, negativeSegments: 3
197: positiveSegments: 11, negativeSegments: 3
198: positiveSegments: 4, negativeSegments: 12
202: positiveSegments: 0, negativeSegments: 21
206: positiveSegments: 4, negativeSegments: 6
208: positiveSegments: 5, negativeSegments: 0
210: positiveSegments: 4, negativeSegments: 6
222: positiveSegments: 0, negativeSegments: 3
count processed: 100, current case index: 229
229: positiveSegments: 0, negativeSegments: 15
232: positiveSegments: 0, negativeSegments: 3
233: positiveSegments: 6, negativeSegments: 0
234: positiveSegments: 0, negativeSegments: 9
236: positiveSegments: 6, negativeSegments: 15
237: positiveSegments: 0, negativeSegments: 9
239: positiveSegments: 0, negativeSegments: 12
241: positiveSegments: 26, negativeSegments: 12
244: positiveSegments: 9, negativeSegments: 0
247: positiveSegments: 0, negativeSegments: 18
250: positiveSegments: 0, negativeSegments: 3
251: positiveSegments: 20, negativeSegments: 0
252: positiveSegments: 10, negativeSegments: 9
256: positiveSegments: 10, negativeSegments: 3
261: positiveSegments: 4, negativeSegments: 3
263: positiveSegments: 0, negativeSegments: 3
266: positiveSegments: 0, negativeSegments: 12
268: exit early, no segments to save
268: positiveSegments: 0, negativeSegments: 0
269: positiveSegments: 8, negativeSegments: 6
270: positiveSegments: 2, negativeSegments: 0
279: positiveSegments: 0, negativeSegments: 6
281: positiveSegments: 4, negativeSegments: 6
282: positiveSegments: 0, negativeSegments: 6
283: positiveSegments: 4, negativeSegments: 3
286: positiveSegments: 0, negativeSegments: 9
287: positiveSegments: 0, negativeSegments: 6
293: positiveSegments: 0, negativeSegments: 3
295: positiveSegments: 13, negativeSegments: 6
296: positiveSegments: 0, negativeSegments: 6
297: positiveSegments: 0, negativeSegments: 6
300: positiveSegments: 5, negativeSegments: 3
303: positiveSegments: 7, negativeSegments: 9
304: positiveSegments: 3, negativeSegments: 9
306: positiveSegments: 0, negativeSegments: 12
308: positiveSegments: 4, negativeSegments: 6
309: positiveSegments: 7, negativeSegments: 0
312: positiveSegments: 4, negativeSegments: 3
316: positiveSegments: 4, negativeSegments: 0
318: positiveSegments: 5, negativeSegments: 0
321: positiveSegments: 4, negativeSegments: 9
323: positiveSegments: 5, negativeSegments: 3
327: positiveSegments: 35, negativeSegments: 0
330: positiveSegments: 4, negativeSegments: 6
337: positiveSegments: 2, negativeSegments: 0
342: positiveSegments: 0, negativeSegments: 6
343: positiveSegments: 0, negativeSegments: 6
345: positiveSegments: 0, negativeSegments: 15
348: positiveSegments: 8, negativeSegments: 3
349: positiveSegments: 6, negativeSegments: 0
353: positiveSegments: 0, negativeSegments: 12
354: positiveSegments: 21, negativeSegments: 0
355: positiveSegments: 9, negativeSegments: 3
357: positiveSegments: 4, negativeSegments: 3
358: positiveSegments: 4, negativeSegments: 0
359: positiveSegments: 10, negativeSegments: 6
362: positiveSegments: 0, negativeSegments: 9
363: positiveSegments: 12, negativeSegments: 3
367: positiveSegments: 4, negativeSegments: 3
369: positiveSegments: 0, negativeSegments: 9
370: positiveSegments: 3, negativeSegments: 0
371: positiveSegments: 4, negativeSegments: 3
375: positiveSegments: 24, negativeSegments: 0
382: positiveSegments: 4, negativeSegments: 12
383: positiveSegments: 4, negativeSegments: 0
384: positiveSegments: 8, negativeSegments: 0
387: positiveSegments: 3, negativeSegments: 0
388: positiveSegments: 4, negativeSegments: 6
390: positiveSegments: 9, negativeSegments: 9
397: positiveSegments: 13, negativeSegments: 0
398: positiveSegments: 0, negativeSegments: 6
402: positiveSegments: 0, negativeSegments: 6
404: positiveSegments: 0, negativeSegments: 6
406: positiveSegments: 4, negativeSegments: 9
409: positiveSegments: 9, negativeSegments: 0
415: positiveSegments: 8, negativeSegments: 0
416: positiveSegments: 11, negativeSegments: 0
417: positiveSegments: 9, negativeSegments: 15
418: positiveSegments: 22, negativeSegments: 0
419: positiveSegments: 0, negativeSegments: 6
427: positiveSegments: 0, negativeSegments: 6
431: positiveSegments: 9, negativeSegments: 0
435: positiveSegments: 0, negativeSegments: 6
439: positiveSegments: 7, negativeSegments: 0
440: positiveSegments: 4, negativeSegments: 6
442: positiveSegments: 3, negativeSegments: 0
445: positiveSegments: 8, negativeSegments: 18
447: positiveSegments: 0, negativeSegments: 3
448: positiveSegments: 0, negativeSegments: 12
449: positiveSegments: 14, negativeSegments: 0
451: positiveSegments: 16, negativeSegments: 3
452: positiveSegments: 9, negativeSegments: 3
455: positiveSegments: 2, negativeSegments: 0
458: positiveSegments: 0, negativeSegments: 3
462: positiveSegments: 6, negativeSegments: 0
466: positiveSegments: 4, negativeSegments: 9
469: positiveSegments: 24, negativeSegments: 3
472: positiveSegments: 8, negativeSegments: 21
474: positiveSegments: 12, negativeSegments: 3
476: positiveSegments: 21, negativeSegments: 3
478: positiveSegments: 4, negativeSegments: 0
count processed: 200, current case index: 481
481: positiveSegments: 6, negativeSegments: 3
484: positiveSegments: 0, negativeSegments: 6
485: positiveSegments: 4, negativeSegments: 0
486: positiveSegments: 8, negativeSegments: 3
488: positiveSegments: 14, negativeSegments: 0
490: positiveSegments: 5, negativeSegments: 6
492: positiveSegments: 16, negativeSegments: 15
499: positiveSegments: 24, negativeSegments: 9
505: positiveSegments: 14, negativeSegments: 3
512: positiveSegments: 0, negativeSegments: 18
513: positiveSegments: 0, negativeSegments: 6
516: positiveSegments: 0, negativeSegments: 3
520: positiveSegments: 10, negativeSegments: 6
521: positiveSegments: 12, negativeSegments: 0
526: positiveSegments: 0, negativeSegments: 6
527: positiveSegments: 10, negativeSegments: 0
530: positiveSegments: 0, negativeSegments: 3
535: positiveSegments: 0, negativeSegments: 3
536: positiveSegments: 0, negativeSegments: 3
537: positiveSegments: 2, negativeSegments: 0
543: positiveSegments: 5, negativeSegments: 3
544: positiveSegments: 0, negativeSegments: 3
545: positiveSegments: 0, negativeSegments: 3
550: positiveSegments: 8, negativeSegments: 6
551: positiveSegments: 15, negativeSegments: 3
553: positiveSegments: 18, negativeSegments: 3
559: positiveSegments: 10, negativeSegments: 6
560: positiveSegments: 0, negativeSegments: 3
561: positiveSegments: 2, negativeSegments: 0
562: positiveSegments: 4, negativeSegments: 6
564: positiveSegments: 7, negativeSegments: 3
566: positiveSegments: 0, negativeSegments: 6
567: positiveSegments: 0, negativeSegments: 9
568: positiveSegments: 26, negativeSegments: 0
570: positiveSegments: 0, negativeSegments: 9
573: positiveSegments: 5, negativeSegments: 6
576: positiveSegments: 0, negativeSegments: 3
577: positiveSegments: 0, negativeSegments: 9
582: positiveSegments: 0, negativeSegments: 3
585: positiveSegments: 4, negativeSegments: 3
587: positiveSegments: 12, negativeSegments: 3
590: positiveSegments: 4, negativeSegments: 3
593: positiveSegments: 0, negativeSegments: 6
599: positiveSegments: 4, negativeSegments: 3
611: positiveSegments: 4, negativeSegments: 9
612: positiveSegments: 0, negativeSegments: 9
616: positiveSegments: 4, negativeSegments: 0
617: positiveSegments: 4, negativeSegments: 3
620: positiveSegments: 0, negativeSegments: 12
621: positiveSegments: 3, negativeSegments: 0
622: positiveSegments: 0, negativeSegments: 3
624: positiveSegments: 4, negativeSegments: 0
627: positiveSegments: 0, negativeSegments: 3
628: positiveSegments: 32, negativeSegments: 0
629: positiveSegments: 28, negativeSegments: 6
631: positiveSegments: 0, negativeSegments: 15
634: positiveSegments: 0, negativeSegments: 6
636: positiveSegments: 0, negativeSegments: 3
637: positiveSegments: 0, negativeSegments: 15
641: exit early, no segments to save
641: positiveSegments: 0, negativeSegments: 0
644: positiveSegments: 4, negativeSegments: 6
645: positiveSegments: 0, negativeSegments: 3
648: positiveSegments: 4, negativeSegments: 0
649: positiveSegments: 17, negativeSegments: 3
652: positiveSegments: 7, negativeSegments: 3
655: positiveSegments: 0, negativeSegments: 9
659: positiveSegments: 4, negativeSegments: 6
660: positiveSegments: 0, negativeSegments: 6
663: positiveSegments: 0, negativeSegments: 6
665: positiveSegments: 0, negativeSegments: 3
666: positiveSegments: 11, negativeSegments: 12
667: positiveSegments: 0, negativeSegments: 9
671: positiveSegments: 0, negativeSegments: 12
672: positiveSegments: 0, negativeSegments: 6
676: positiveSegments: 5, negativeSegments: 3
680: positiveSegments: 2, negativeSegments: 0
683: positiveSegments: 0, negativeSegments: 9
684: positiveSegments: 0, negativeSegments: 3
685: positiveSegments: 0, negativeSegments: 9
687: positiveSegments: 4, negativeSegments: 3
691: positiveSegments: 0, negativeSegments: 12
697: positiveSegments: 8, negativeSegments: 0
698: positiveSegments: 17, negativeSegments: 3
699: positiveSegments: 9, negativeSegments: 3
702: positiveSegments: 8, negativeSegments: 12
703: positiveSegments: 13, negativeSegments: 3
706: positiveSegments: 14, negativeSegments: 0
711: positiveSegments: 0, negativeSegments: 18
716: positiveSegments: 4, negativeSegments: 3
719: positiveSegments: 0, negativeSegments: 15
721: positiveSegments: 9, negativeSegments: 0
722: positiveSegments: 4, negativeSegments: 6
725: positiveSegments: 11, negativeSegments: 15
726: positiveSegments: 0, negativeSegments: 3
728: positiveSegments: 16, negativeSegments: 15
730: positiveSegments: 8, negativeSegments: 3
733: positiveSegments: 10, negativeSegments: 0
734: positiveSegments: 0, negativeSegments: 12
737: positiveSegments: 4, negativeSegments: 3
739: positiveSegments: 0, negativeSegments: 3
count processed: 300, current case index: 740
740: positiveSegments: 4, negativeSegments: 3
742: positiveSegments: 0, negativeSegments: 6
744: positiveSegments: 0, negativeSegments: 3
745: positiveSegments: 0, negativeSegments: 3
746: positiveSegments: 0, negativeSegments: 3
748: positiveSegments: 10, negativeSegments: 12
749: positiveSegments: 0, negativeSegments: 3
750: positiveSegments: 49, negativeSegments: 12
751: positiveSegments: 0, negativeSegments: 12
752: positiveSegments: 0, negativeSegments: 12
753: positiveSegments: 2, negativeSegments: 0
755: positiveSegments: 7, negativeSegments: 3
756: positiveSegments: 0, negativeSegments: 9
757: positiveSegments: 0, negativeSegments: 3
758: positiveSegments: 10, negativeSegments: 0
761: positiveSegments: 4, negativeSegments: 0
762: positiveSegments: 0, negativeSegments: 9
763: positiveSegments: 13, negativeSegments: 6
764: positiveSegments: 24, negativeSegments: 6
765: positiveSegments: 12, negativeSegments: 12
767: positiveSegments: 0, negativeSegments: 3
770: positiveSegments: 3, negativeSegments: 0
772: positiveSegments: 0, negativeSegments: 3
773: positiveSegments: 0, negativeSegments: 3
774: positiveSegments: 13, negativeSegments: 3
775: positiveSegments: 14, negativeSegments: 0
776: positiveSegments: 11, negativeSegments: 3
777: positiveSegments: 7, negativeSegments: 3
779: positiveSegments: 4, negativeSegments: 12
781: positiveSegments: 6, negativeSegments: 0
783: positiveSegments: 4, negativeSegments: 0
788: positiveSegments: 0, negativeSegments: 3
793: positiveSegments: 17, negativeSegments: 0
794: positiveSegments: 9, negativeSegments: 0
795: positiveSegments: 0, negativeSegments: 6
800: positiveSegments: 10, negativeSegments: 3
802: positiveSegments: 19, negativeSegments: 6
807: positiveSegments: 8, negativeSegments: 6
808: positiveSegments: 10, negativeSegments: 3
812: positiveSegments: 0, negativeSegments: 3
813: positiveSegments: 4, negativeSegments: 3
814: positiveSegments: 0, negativeSegments: 12
815: positiveSegments: 0, negativeSegments: 15
816: positiveSegments: 0, negativeSegments: 12
819: positiveSegments: 11, negativeSegments: 0
822: positiveSegments: 4, negativeSegments: 15
825: positiveSegments: 0, negativeSegments: 6
827: positiveSegments: 3, negativeSegments: 0
830: positiveSegments: 0, negativeSegments: 6
831: positiveSegments: 0, negativeSegments: 3
833: positiveSegments: 11, negativeSegments: 3
835: positiveSegments: 0, negativeSegments: 3
841: positiveSegments: 0, negativeSegments: 9
843: positiveSegments: 15, negativeSegments: 9
846: positiveSegments: 14, negativeSegments: 0
847: positiveSegments: 0, negativeSegments: 15
851: positiveSegments: 22, negativeSegments: 3
852: positiveSegments: 0, negativeSegments: 3
853: positiveSegments: 4, negativeSegments: 6
855: positiveSegments: 7, negativeSegments: 3
859: positiveSegments: 4, negativeSegments: 9
860: positiveSegments: 3, negativeSegments: 3
864: positiveSegments: 3, negativeSegments: 0
865: positiveSegments: 0, negativeSegments: 6
866: positiveSegments: 0, negativeSegments: 6
868: positiveSegments: 6, negativeSegments: 0
869: positiveSegments: 8, negativeSegments: 12
870: positiveSegments: 9, negativeSegments: 3
871: positiveSegments: 0, negativeSegments: 9
872: positiveSegments: 0, negativeSegments: 6
876: positiveSegments: 0, negativeSegments: 3
879: positiveSegments: 0, negativeSegments: 3
880: positiveSegments: 8, negativeSegments: 3
883: positiveSegments: 15, negativeSegments: 6
885: positiveSegments: 3, negativeSegments: 15
886: positiveSegments: 11, negativeSegments: 18
887: positiveSegments: 4, negativeSegments: 3
890: positiveSegments: 4, negativeSegments: 3
892: positiveSegments: 6, negativeSegments: 3
894: positiveSegments: 4, negativeSegments: 0
898: positiveSegments: 0, negativeSegments: 3
907: positiveSegments: 8, negativeSegments: 9
912: positiveSegments: 4, negativeSegments: 3
913: positiveSegments: 0, negativeSegments: 3
916: positiveSegments: 15, negativeSegments: 6
919: positiveSegments: 0, negativeSegments: 6
922: positiveSegments: 0, negativeSegments: 3
931: positiveSegments: 1, negativeSegments: 0
932: positiveSegments: 4, negativeSegments: 3
936: positiveSegments: 9, negativeSegments: 3
937: positiveSegments: 0, negativeSegments: 15
938: positiveSegments: 7, negativeSegments: 0
939: positiveSegments: 8, negativeSegments: 3
940: positiveSegments: 6, negativeSegments: 0
944: positiveSegments: 12, negativeSegments: 12
945: positiveSegments: 20, negativeSegments: 3
946: positiveSegments: 9, negativeSegments: 0
947: positiveSegments: 0, negativeSegments: 15
948: positiveSegments: 4, negativeSegments: 6
949: positiveSegments: 8, negativeSegments: 3
count processed: 400, current case index: 954
954: positiveSegments: 4, negativeSegments: 6
957: positiveSegments: 0, negativeSegments: 9
958: positiveSegments: 7, negativeSegments: 0
959: positiveSegments: 0, negativeSegments: 12
963: positiveSegments: 0, negativeSegments: 12
969: positiveSegments: 0, negativeSegments: 3
971: positiveSegments: 4, negativeSegments: 9
972: positiveSegments: 7, negativeSegments: 0
973: positiveSegments: 4, negativeSegments: 0
976: positiveSegments: 0, negativeSegments: 15
977: positiveSegments: 0, negativeSegments: 3
979: positiveSegments: 4, negativeSegments: 0
980: positiveSegments: 8, negativeSegments: 18
983: positiveSegments: 0, negativeSegments: 6
984: positiveSegments: 3, negativeSegments: 0
985: positiveSegments: 8, negativeSegments: 6
986: positiveSegments: 0, negativeSegments: 6
988: positiveSegments: 1, negativeSegments: 9
990: positiveSegments: 13, negativeSegments: 0
991: positiveSegments: 10, negativeSegments: 0
992: positiveSegments: 4, negativeSegments: 12
994: positiveSegments: 2, negativeSegments: 0
995: positiveSegments: 8, negativeSegments: 3
1002: positiveSegments: 3, negativeSegments: 6
1003: positiveSegments: 0, negativeSegments: 3
1005: positiveSegments: 3, negativeSegments: 12
1012: positiveSegments: 4, negativeSegments: 9
1013: positiveSegments: 4, negativeSegments: 6
1015: positiveSegments: 1, negativeSegments: 0
1016: positiveSegments: 0, negativeSegments: 6
1017: positiveSegments: 0, negativeSegments: 3
1018: positiveSegments: 12, negativeSegments: 9
1020: positiveSegments: 0, negativeSegments: 6
1022: positiveSegments: 0, negativeSegments: 9
1024: positiveSegments: 2, negativeSegments: 0
1025: positiveSegments: 10, negativeSegments: 15
1026: positiveSegments: 12, negativeSegments: 12
1027: positiveSegments: 11, negativeSegments: 3
1028: positiveSegments: 0, negativeSegments: 3
1029: positiveSegments: 0, negativeSegments: 6
1030: positiveSegments: 0, negativeSegments: 6
1032: positiveSegments: 6, negativeSegments: 0
1033: positiveSegments: 0, negativeSegments: 3
1034: positiveSegments: 0, negativeSegments: 3
1035: positiveSegments: 0, negativeSegments: 9
1037: positiveSegments: 0, negativeSegments: 21
1038: positiveSegments: 0, negativeSegments: 6
1040: positiveSegments: 4, negativeSegments: 3
1041: positiveSegments: 0, negativeSegments: 6
1043: positiveSegments: 0, negativeSegments: 3
1044: positiveSegments: 3, negativeSegments: 9
1046: positiveSegments: 9, negativeSegments: 0
1047: positiveSegments: 0, negativeSegments: 3
1049: positiveSegments: 1, negativeSegments: 6
1050: positiveSegments: 0, negativeSegments: 3
1051: positiveSegments: 0, negativeSegments: 3
1055: positiveSegments: 0, negativeSegments: 3
1056: positiveSegments: 8, negativeSegments: 0
1061: positiveSegments: 0, negativeSegments: 9
1063: positiveSegments: 10, negativeSegments: 3
1069: positiveSegments: 22, negativeSegments: 3
1073: positiveSegments: 8, negativeSegments: 0
1074: positiveSegments: 7, negativeSegments: 0
1076: positiveSegments: 12, negativeSegments: 3
1078: positiveSegments: 3, negativeSegments: 6
1081: positiveSegments: 4, negativeSegments: 0
1083: positiveSegments: 5, negativeSegments: 0
1084: positiveSegments: 0, negativeSegments: 9
1086: positiveSegments: 18, negativeSegments: 6
1087: positiveSegments: 4, negativeSegments: 6
1088: positiveSegments: 0, negativeSegments: 6
1089: positiveSegments: 0, negativeSegments: 3
1090: positiveSegments: 6, negativeSegments: 3
1093: positiveSegments: 4, negativeSegments: 6
1094: positiveSegments: 17, negativeSegments: 15
1095: positiveSegments: 0, negativeSegments: 21
1096: positiveSegments: 0, negativeSegments: 9
1097: positiveSegments: 9, negativeSegments: 0
1098: positiveSegments: 8, negativeSegments: 0
1102: positiveSegments: 6, negativeSegments: 0
1108: positiveSegments: 6, negativeSegments: 3
1109: positiveSegments: 0, negativeSegments: 3
1113: positiveSegments: 8, negativeSegments: 12
1114: positiveSegments: 13, negativeSegments: 0
1115: positiveSegments: 0, negativeSegments: 12
1118: positiveSegments: 8, negativeSegments: 12
1123: positiveSegments: 14, negativeSegments: 6
1124: positiveSegments: 8, negativeSegments: 0
1125: positiveSegments: 4, negativeSegments: 9
1127: positiveSegments: 10, negativeSegments: 3
1131: positiveSegments: 0, negativeSegments: 6
1132: positiveSegments: 20, negativeSegments: 9
1135: positiveSegments: 0, negativeSegments: 6
1138: positiveSegments: 4, negativeSegments: 0
1139: positiveSegments: 0, negativeSegments: 3
1143: positiveSegments: 5, negativeSegments: 0
1145: positiveSegments: 0, negativeSegments: 12
1154: positiveSegments: 0, negativeSegments: 6
1158: positiveSegments: 4, negativeSegments: 0
1159: positiveSegments: 10, negativeSegments: 3
count processed: 500, current case index: 1160
1160: positiveSegments: 11, negativeSegments: 0
1165: positiveSegments: 9, negativeSegments: 0
1166: positiveSegments: 9, negativeSegments: 3
1170: positiveSegments: 0, negativeSegments: 6
1174: positiveSegments: 4, negativeSegments: 0
1176: positiveSegments: 0, negativeSegments: 9
1180: positiveSegments: 8, negativeSegments: 9
1181: positiveSegments: 4, negativeSegments: 0
1182: positiveSegments: 8, negativeSegments: 9
1184: positiveSegments: 8, negativeSegments: 0
1185: positiveSegments: 22, negativeSegments: 6
1186: positiveSegments: 0, negativeSegments: 3
1187: positiveSegments: 0, negativeSegments: 3
1189: positiveSegments: 15, negativeSegments: 0
1191: positiveSegments: 23, negativeSegments: 6
1193: positiveSegments: 10, negativeSegments: 3
1194: positiveSegments: 8, negativeSegments: 3
1196: positiveSegments: 4, negativeSegments: 0
1199: positiveSegments: 4, negativeSegments: 0
1200: positiveSegments: 8, negativeSegments: 6
1201: positiveSegments: 0, negativeSegments: 3
1202: positiveSegments: 0, negativeSegments: 3
1204: positiveSegments: 12, negativeSegments: 3
1205: positiveSegments: 5, negativeSegments: 6
1208: positiveSegments: 6, negativeSegments: 0
1209: positiveSegments: 5, negativeSegments: 0
1213: positiveSegments: 0, negativeSegments: 6
1215: positiveSegments: 17, negativeSegments: 3
1216: positiveSegments: 16, negativeSegments: 21
1217: positiveSegments: 8, negativeSegments: 0
1219: positiveSegments: 4, negativeSegments: 3
1221: positiveSegments: 8, negativeSegments: 9
1222: positiveSegments: 4, negativeSegments: 9
1224: positiveSegments: 4, negativeSegments: 0
1225: positiveSegments: 0, negativeSegments: 3
1228: positiveSegments: 13, negativeSegments: 0
1229: positiveSegments: 12, negativeSegments: 0
1230: positiveSegments: 6, negativeSegments: 15
1232: positiveSegments: 3, negativeSegments: 3
1233: positiveSegments: 4, negativeSegments: 0
1234: positiveSegments: 0, negativeSegments: 6
1236: positiveSegments: 10, negativeSegments: 0
1237: positiveSegments: 4, negativeSegments: 0
1239: positiveSegments: 0, negativeSegments: 9
1240: positiveSegments: 4, negativeSegments: 6
1244: positiveSegments: 8, negativeSegments: 12
1248: positiveSegments: 0, negativeSegments: 12
1249: positiveSegments: 1, negativeSegments: 6
1256: positiveSegments: 0, negativeSegments: 9
1261: positiveSegments: 10, negativeSegments: 9
1263: positiveSegments: 8, negativeSegments: 9
1265: positiveSegments: 0, negativeSegments: 9
1267: positiveSegments: 0, negativeSegments: 9
1268: positiveSegments: 0, negativeSegments: 6
1272: positiveSegments: 10, negativeSegments: 0
1275: positiveSegments: 11, negativeSegments: 3
1277: positiveSegments: 0, negativeSegments: 9
1279: positiveSegments: 0, negativeSegments: 3
1280: positiveSegments: 0, negativeSegments: 3
1285: positiveSegments: 3, negativeSegments: 6
1286: positiveSegments: 25, negativeSegments: 12
1290: positiveSegments: 0, negativeSegments: 6
1291: positiveSegments: 16, negativeSegments: 6
1292: positiveSegments: 14, negativeSegments: 6
1293: positiveSegments: 12, negativeSegments: 0
1297: positiveSegments: 4, negativeSegments: 0
1298: positiveSegments: 0, negativeSegments: 6
1300: positiveSegments: 2, negativeSegments: 0
1302: positiveSegments: 0, negativeSegments: 9
1303: positiveSegments: 0, negativeSegments: 6
1305: positiveSegments: 10, negativeSegments: 0
1307: positiveSegments: 4, negativeSegments: 12
1309: positiveSegments: 0, negativeSegments: 9
1311: positiveSegments: 0, negativeSegments: 6
1313: positiveSegments: 6, negativeSegments: 6
1315: positiveSegments: 4, negativeSegments: 6
1316: positiveSegments: 4, negativeSegments: 3
1317: positiveSegments: 4, negativeSegments: 0
1319: positiveSegments: 4, negativeSegments: 3
1320: positiveSegments: 4, negativeSegments: 0
1321: positiveSegments: 0, negativeSegments: 12
1323: positiveSegments: 3, negativeSegments: 0
1324: positiveSegments: 3, negativeSegments: 9
1325: positiveSegments: 12, negativeSegments: 9
1333: positiveSegments: 0, negativeSegments: 6
1335: positiveSegments: 10, negativeSegments: 9
1339: positiveSegments: 0, negativeSegments: 3
1341: positiveSegments: 0, negativeSegments: 3
1343: positiveSegments: 4, negativeSegments: 3
1344: positiveSegments: 0, negativeSegments: 9
1346: positiveSegments: 0, negativeSegments: 12
1347: positiveSegments: 0, negativeSegments: 9
1350: positiveSegments: 0, negativeSegments: 15
1353: positiveSegments: 4, negativeSegments: 0
1356: positiveSegments: 5, negativeSegments: 0
1358: positiveSegments: 0, negativeSegments: 6
1359: positiveSegments: 18, negativeSegments: 6
1362: positiveSegments: 8, negativeSegments: 0
1364: positiveSegments: 14, negativeSegments: 0
1365: positiveSegments: 8, negativeSegments: 0
count processed: 600, current case index: 1367
1367: positiveSegments: 23, negativeSegments: 3
1368: positiveSegments: 4, negativeSegments: 0
1374: positiveSegments: 16, negativeSegments: 15
1375: positiveSegments: 4, negativeSegments: 3
1376: positiveSegments: 0, negativeSegments: 3
1381: positiveSegments: 8, negativeSegments: 3
1383: positiveSegments: 10, negativeSegments: 6
1386: positiveSegments: 0, negativeSegments: 3
1389: positiveSegments: 0, negativeSegments: 15
1396: positiveSegments: 4, negativeSegments: 9
1397: positiveSegments: 10, negativeSegments: 3
1398: positiveSegments: 0, negativeSegments: 6
1399: positiveSegments: 11, negativeSegments: 3
1402: positiveSegments: 11, negativeSegments: 0
1403: positiveSegments: 18, negativeSegments: 24
1404: positiveSegments: 0, negativeSegments: 3
1407: positiveSegments: 29, negativeSegments: 3
1408: positiveSegments: 0, negativeSegments: 3
1414: positiveSegments: 4, negativeSegments: 0
1415: positiveSegments: 4, negativeSegments: 3
1416: positiveSegments: 18, negativeSegments: 3
1417: positiveSegments: 0, negativeSegments: 3
1421: positiveSegments: 4, negativeSegments: 6
1422: positiveSegments: 0, negativeSegments: 15
1426: positiveSegments: 4, negativeSegments: 6
1428: positiveSegments: 4, negativeSegments: 6
1432: positiveSegments: 13, negativeSegments: 6
1434: positiveSegments: 19, negativeSegments: 3
1436: positiveSegments: 5, negativeSegments: 12
1438: positiveSegments: 7, negativeSegments: 0
1442: positiveSegments: 4, negativeSegments: 12
1446: positiveSegments: 6, negativeSegments: 3
1452: positiveSegments: 0, negativeSegments: 6
1454: positiveSegments: 0, negativeSegments: 18
1458: positiveSegments: 4, negativeSegments: 0
1463: positiveSegments: 0, negativeSegments: 9
1465: positiveSegments: 4, negativeSegments: 3
1468: positiveSegments: 0, negativeSegments: 18
1469: positiveSegments: 28, negativeSegments: 3
1470: positiveSegments: 4, negativeSegments: 0
1471: positiveSegments: 4, negativeSegments: 0
1473: positiveSegments: 0, negativeSegments: 6
1474: positiveSegments: 0, negativeSegments: 6
1475: positiveSegments: 22, negativeSegments: 3
1478: positiveSegments: 0, negativeSegments: 6
1479: positiveSegments: 0, negativeSegments: 6
1482: positiveSegments: 0, negativeSegments: 18
1485: positiveSegments: 22, negativeSegments: 3
1486: positiveSegments: 15, negativeSegments: 6
1487: positiveSegments: 0, negativeSegments: 3
1488: positiveSegments: 4, negativeSegments: 0
1489: positiveSegments: 15, negativeSegments: 3
1490: positiveSegments: 4, negativeSegments: 3
1492: positiveSegments: 38, negativeSegments: 0
1493: positiveSegments: 12, negativeSegments: 9
1496: positiveSegments: 2, negativeSegments: 3
1497: positiveSegments: 7, negativeSegments: 6
1498: positiveSegments: 0, negativeSegments: 6
1500: positiveSegments: 0, negativeSegments: 6
1503: positiveSegments: 4, negativeSegments: 0
1512: positiveSegments: 0, negativeSegments: 3
1515: positiveSegments: 33, negativeSegments: 3
1520: positiveSegments: 8, negativeSegments: 0
1521: positiveSegments: 8, negativeSegments: 9
1522: positiveSegments: 0, negativeSegments: 9
1523: positiveSegments: 8, negativeSegments: 24
1525: positiveSegments: 18, negativeSegments: 3
1526: positiveSegments: 0, negativeSegments: 9
1527: positiveSegments: 0, negativeSegments: 6
1536: positiveSegments: 0, negativeSegments: 3
1537: positiveSegments: 8, negativeSegments: 0
1539: positiveSegments: 4, negativeSegments: 24
1540: positiveSegments: 1, negativeSegments: 0
1541: positiveSegments: 12, negativeSegments: 0
1542: positiveSegments: 8, negativeSegments: 6
1545: positiveSegments: 10, negativeSegments: 3
1546: positiveSegments: 0, negativeSegments: 3
1548: positiveSegments: 0, negativeSegments: 9
1549: positiveSegments: 7, negativeSegments: 6
1552: positiveSegments: 4, negativeSegments: 0
1555: positiveSegments: 0, negativeSegments: 3
1556: positiveSegments: 4, negativeSegments: 18
1558: positiveSegments: 16, negativeSegments: 0
1559: positiveSegments: 8, negativeSegments: 6
1561: positiveSegments: 0, negativeSegments: 3
1562: positiveSegments: 2, negativeSegments: 9
1564: positiveSegments: 27, negativeSegments: 3
1566: positiveSegments: 4, negativeSegments: 6
1567: positiveSegments: 5, negativeSegments: 0
1568: positiveSegments: 8, negativeSegments: 0
1574: positiveSegments: 0, negativeSegments: 9
1575: positiveSegments: 0, negativeSegments: 6
1580: positiveSegments: 9, negativeSegments: 3
1581: positiveSegments: 0, negativeSegments: 6
1583: positiveSegments: 16, negativeSegments: 6
1585: positiveSegments: 0, negativeSegments: 3
1586: positiveSegments: 4, negativeSegments: 12
1590: positiveSegments: 20, negativeSegments: 0
1591: positiveSegments: 7, negativeSegments: 6
1594: positiveSegments: 4, negativeSegments: 0
count processed: 700, current case index: 1595
1595: positiveSegments: 4, negativeSegments: 12
1596: positiveSegments: 0, negativeSegments: 3
1597: positiveSegments: 4, negativeSegments: 15
1599: positiveSegments: 11, negativeSegments: 15
1600: exit early, no segments to save
1600: positiveSegments: 0, negativeSegments: 0
1602: positiveSegments: 4, negativeSegments: 18
1605: positiveSegments: 16, negativeSegments: 6
1608: positiveSegments: 2, negativeSegments: 0
1610: positiveSegments: 0, negativeSegments: 3
1613: positiveSegments: 0, negativeSegments: 3
1614: positiveSegments: 0, negativeSegments: 9
1615: positiveSegments: 1, negativeSegments: 12
1616: positiveSegments: 4, negativeSegments: 6
1618: positiveSegments: 4, negativeSegments: 3
1620: positiveSegments: 8, negativeSegments: 0
1623: positiveSegments: 2, negativeSegments: 0
1630: positiveSegments: 12, negativeSegments: 3
1632: positiveSegments: 2, negativeSegments: 0
1633: positiveSegments: 11, negativeSegments: 0
1636: positiveSegments: 4, negativeSegments: 3
1639: positiveSegments: 7, negativeSegments: 0
1641: positiveSegments: 4, negativeSegments: 12
1642: positiveSegments: 8, negativeSegments: 0
1647: positiveSegments: 6, negativeSegments: 3
1648: positiveSegments: 0, negativeSegments: 3
1656: positiveSegments: 0, negativeSegments: 3
1657: positiveSegments: 0, negativeSegments: 12
1658: positiveSegments: 4, negativeSegments: 6
1665: positiveSegments: 8, negativeSegments: 0
1666: positiveSegments: 8, negativeSegments: 9
1671: positiveSegments: 15, negativeSegments: 3
1672: positiveSegments: 0, negativeSegments: 12
1673: positiveSegments: 9, negativeSegments: 0
1674: positiveSegments: 10, negativeSegments: 0
1684: positiveSegments: 4, negativeSegments: 0
1685: positiveSegments: 0, negativeSegments: 3
1689: positiveSegments: 0, negativeSegments: 6
1690: positiveSegments: 4, negativeSegments: 6
1694: positiveSegments: 14, negativeSegments: 3
1695: positiveSegments: 4, negativeSegments: 3
1696: positiveSegments: 0, negativeSegments: 18
1699: positiveSegments: 0, negativeSegments: 3
1700: positiveSegments: 0, negativeSegments: 6
1703: positiveSegments: 13, negativeSegments: 12
1705: positiveSegments: 4, negativeSegments: 12
1706: positiveSegments: 0, negativeSegments: 9
1708: positiveSegments: 4, negativeSegments: 12
1710: positiveSegments: 8, negativeSegments: 3
1714: positiveSegments: 0, negativeSegments: 9
1716: positiveSegments: 16, negativeSegments: 0
1718: positiveSegments: 15, negativeSegments: 3
1719: positiveSegments: 4, negativeSegments: 0
1722: positiveSegments: 0, negativeSegments: 18
1724: positiveSegments: 5, negativeSegments: 3
1726: positiveSegments: 2, negativeSegments: 9
1728: positiveSegments: 0, negativeSegments: 9
1729: positiveSegments: 4, negativeSegments: 12
1730: positiveSegments: 10, negativeSegments: 0
1732: positiveSegments: 0, negativeSegments: 6
1733: positiveSegments: 15, negativeSegments: 0
1735: positiveSegments: 0, negativeSegments: 6
1737: positiveSegments: 4, negativeSegments: 6
1738: positiveSegments: 11, negativeSegments: 0
1743: positiveSegments: 4, negativeSegments: 9
1745: positiveSegments: 21, negativeSegments: 6
1747: positiveSegments: 0, negativeSegments: 6
1748: positiveSegments: 0, negativeSegments: 3
1749: positiveSegments: 0, negativeSegments: 6
1752: positiveSegments: 37, negativeSegments: 3
1753: positiveSegments: 1, negativeSegments: 0
1756: positiveSegments: 0, negativeSegments: 6
1757: positiveSegments: 0, negativeSegments: 6
1759: positiveSegments: 0, negativeSegments: 3
1761: positiveSegments: 3, negativeSegments: 0
1762: positiveSegments: 0, negativeSegments: 12
1763: positiveSegments: 4, negativeSegments: 3
1765: positiveSegments: 4, negativeSegments: 3
1766: positiveSegments: 12, negativeSegments: 0
1768: positiveSegments: 4, negativeSegments: 0
1771: positiveSegments: 6, negativeSegments: 0
1773: positiveSegments: 6, negativeSegments: 0
1775: positiveSegments: 3, negativeSegments: 0
1777: positiveSegments: 0, negativeSegments: 12
1779: positiveSegments: 2, negativeSegments: 12
1783: positiveSegments: 10, negativeSegments: 3
1784: positiveSegments: 7, negativeSegments: 0
1785: positiveSegments: 21, negativeSegments: 0
1793: positiveSegments: 11, negativeSegments: 0
1799: positiveSegments: 33, negativeSegments: 3
1800: positiveSegments: 0, negativeSegments: 3
1802: positiveSegments: 0, negativeSegments: 6
1803: positiveSegments: 39, negativeSegments: 21
1805: positiveSegments: 4, negativeSegments: 0
1809: positiveSegments: 0, negativeSegments: 12
1810: positiveSegments: 0, negativeSegments: 3
1812: positiveSegments: 8, negativeSegments: 0
1814: positiveSegments: 2, negativeSegments: 3
1816: positiveSegments: 18, negativeSegments: 15
1819: positiveSegments: 19, negativeSegments: 0
1820: positiveSegments: 11, negativeSegments: 0
count processed: 800, current case index: 1822
1822: positiveSegments: 9, negativeSegments: 3
1823: positiveSegments: 0, negativeSegments: 3
1825: positiveSegments: 0, negativeSegments: 3
1826: positiveSegments: 3, negativeSegments: 0
1832: positiveSegments: 26, negativeSegments: 3
1833: positiveSegments: 19, negativeSegments: 0
1834: positiveSegments: 6, negativeSegments: 0
1835: positiveSegments: 12, negativeSegments: 0
1836: positiveSegments: 4, negativeSegments: 0
1838: positiveSegments: 10, negativeSegments: 6
1840: positiveSegments: 1, negativeSegments: 3
1843: positiveSegments: 14, negativeSegments: 6
1844: positiveSegments: 8, negativeSegments: 0
1846: positiveSegments: 2, negativeSegments: 0
1848: positiveSegments: 6, negativeSegments: 6
1852: positiveSegments: 9, negativeSegments: 3
1853: positiveSegments: 0, negativeSegments: 9
1854: positiveSegments: 11, negativeSegments: 0
1855: positiveSegments: 24, negativeSegments: 0
1862: positiveSegments: 15, negativeSegments: 9
1865: positiveSegments: 5, negativeSegments: 0
1866: positiveSegments: 4, negativeSegments: 0
1869: positiveSegments: 0, negativeSegments: 3
1872: positiveSegments: 12, negativeSegments: 0
1873: positiveSegments: 8, negativeSegments: 6
1874: positiveSegments: 8, negativeSegments: 3
1882: positiveSegments: 0, negativeSegments: 3
1884: positiveSegments: 12, negativeSegments: 9
1885: positiveSegments: 0, negativeSegments: 3
1886: positiveSegments: 8, negativeSegments: 6
1888: positiveSegments: 0, negativeSegments: 3
1891: positiveSegments: 0, negativeSegments: 9
1892: positiveSegments: 13, negativeSegments: 9
1893: positiveSegments: 18, negativeSegments: 9
1894: positiveSegments: 4, negativeSegments: 3
1896: positiveSegments: 6, negativeSegments: 3
1899: positiveSegments: 0, negativeSegments: 9
1900: positiveSegments: 21, negativeSegments: 3
1901: positiveSegments: 16, negativeSegments: 6
1903: positiveSegments: 16, negativeSegments: 6
1910: positiveSegments: 3, negativeSegments: 0
1912: positiveSegments: 6, negativeSegments: 21
1914: positiveSegments: 10, negativeSegments: 3
1915: positiveSegments: 2, negativeSegments: 0
1916: positiveSegments: 0, negativeSegments: 15
1918: positiveSegments: 4, negativeSegments: 9
1920: positiveSegments: 12, negativeSegments: 3
1922: positiveSegments: 0, negativeSegments: 6
1925: positiveSegments: 0, negativeSegments: 3
1926: positiveSegments: 4, negativeSegments: 0
1928: positiveSegments: 12, negativeSegments: 0
1932: positiveSegments: 26, negativeSegments: 6
1934: positiveSegments: 23, negativeSegments: 0
1935: positiveSegments: 2, negativeSegments: 12
1936: positiveSegments: 44, negativeSegments: 3
1937: positiveSegments: 8, negativeSegments: 9
1938: positiveSegments: 8, negativeSegments: 12
1941: positiveSegments: 24, negativeSegments: 6
1942: positiveSegments: 6, negativeSegments: 0
1944: positiveSegments: 8, negativeSegments: 6
1947: positiveSegments: 0, negativeSegments: 3
1949: positiveSegments: 4, negativeSegments: 6
1950: positiveSegments: 0, negativeSegments: 3
1955: positiveSegments: 2, negativeSegments: 6
1956: positiveSegments: 4, negativeSegments: 3
1957: positiveSegments: 4, negativeSegments: 0
1959: positiveSegments: 9, negativeSegments: 6
1961: positiveSegments: 18, negativeSegments: 12
1963: positiveSegments: 19, negativeSegments: 6
1965: positiveSegments: 10, negativeSegments: 0
1966: positiveSegments: 6, negativeSegments: 0
1969: positiveSegments: 0, negativeSegments: 6
1973: positiveSegments: 0, negativeSegments: 9
1976: positiveSegments: 20, negativeSegments: 12
1978: positiveSegments: 8, negativeSegments: 6
1985: positiveSegments: 4, negativeSegments: 0
1988: positiveSegments: 0, negativeSegments: 6
1993: positiveSegments: 0, negativeSegments: 6
1994: positiveSegments: 2, negativeSegments: 3
1995: positiveSegments: 6, negativeSegments: 0
1996: positiveSegments: 8, negativeSegments: 3
2000: positiveSegments: 0, negativeSegments: 3
2002: positiveSegments: 2, negativeSegments: 6
2004: positiveSegments: 2, negativeSegments: 6
2010: positiveSegments: 0, negativeSegments: 3
2011: positiveSegments: 0, negativeSegments: 18
2012: positiveSegments: 2, negativeSegments: 0
2014: positiveSegments: 0, negativeSegments: 9
2017: positiveSegments: 12, negativeSegments: 0
2018: positiveSegments: 18, negativeSegments: 3
2020: positiveSegments: 7, negativeSegments: 0
2025: positiveSegments: 4, negativeSegments: 0
2026: positiveSegments: 11, negativeSegments: 0
2028: positiveSegments: 0, negativeSegments: 21
2029: positiveSegments: 0, negativeSegments: 9
2040: positiveSegments: 10, negativeSegments: 9
2041: positiveSegments: 2, negativeSegments: 3
2044: positiveSegments: 4, negativeSegments: 9
2046: positiveSegments: 4, negativeSegments: 0
2049: positiveSegments: 0, negativeSegments: 3
count processed: 900, current case index: 2055
2055: positiveSegments: 8, negativeSegments: 12
2057: positiveSegments: 9, negativeSegments: 0
2058: positiveSegments: 6, negativeSegments: 0
2060: positiveSegments: 0, negativeSegments: 12
2061: positiveSegments: 4, negativeSegments: 0
2062: positiveSegments: 0, negativeSegments: 12
2064: positiveSegments: 16, negativeSegments: 0
2066: positiveSegments: 7, negativeSegments: 6
2067: positiveSegments: 15, negativeSegments: 3
2068: positiveSegments: 10, negativeSegments: 0
2072: positiveSegments: 2, negativeSegments: 6
2075: positiveSegments: 0, negativeSegments: 3
2081: positiveSegments: 2, negativeSegments: 0
2082: positiveSegments: 8, negativeSegments: 9
2086: positiveSegments: 8, negativeSegments: 0
2088: positiveSegments: 0, negativeSegments: 3
2097: positiveSegments: 3, negativeSegments: 0
2098: positiveSegments: 0, negativeSegments: 3
2106: positiveSegments: 0, negativeSegments: 15
2112: positiveSegments: 7, negativeSegments: 0
2114: positiveSegments: 0, negativeSegments: 3
2117: positiveSegments: 0, negativeSegments: 3
2118: positiveSegments: 0, negativeSegments: 12
2121: positiveSegments: 8, negativeSegments: 3
2130: positiveSegments: 0, negativeSegments: 12
2132: positiveSegments: 7, negativeSegments: 0
2133: positiveSegments: 4, negativeSegments: 0
2136: positiveSegments: 21, negativeSegments: 9
2139: positiveSegments: 0, negativeSegments: 6
2142: positiveSegments: 4, negativeSegments: 0
2147: positiveSegments: 5, negativeSegments: 9
2148: positiveSegments: 0, negativeSegments: 6
2149: positiveSegments: 4, negativeSegments: 6
2150: positiveSegments: 0, negativeSegments: 6
2153: positiveSegments: 33, negativeSegments: 12
2154: positiveSegments: 0, negativeSegments: 12
2158: exit early, no segments to save
2158: positiveSegments: 0, negativeSegments: 0
2161: positiveSegments: 16, negativeSegments: 3
2163: positiveSegments: 0, negativeSegments: 3
2165: positiveSegments: 14, negativeSegments: 12
2168: positiveSegments: 27, negativeSegments: 3
2169: positiveSegments: 0, negativeSegments: 12
2172: positiveSegments: 0, negativeSegments: 18
2174: positiveSegments: 0, negativeSegments: 3
2175: positiveSegments: 0, negativeSegments: 3
2176: positiveSegments: 8, negativeSegments: 0
2183: positiveSegments: 0, negativeSegments: 9
2185: positiveSegments: 4, negativeSegments: 9
2187: positiveSegments: 2, negativeSegments: 0
2192: positiveSegments: 14, negativeSegments: 9
2194: positiveSegments: 0, negativeSegments: 18
2195: positiveSegments: 4, negativeSegments: 6
2196: positiveSegments: 0, negativeSegments: 3
2197: positiveSegments: 25, negativeSegments: 6
2201: positiveSegments: 16, negativeSegments: 0
2205: positiveSegments: 0, negativeSegments: 3
2206: positiveSegments: 0, negativeSegments: 3
2210: positiveSegments: 8, negativeSegments: 3
2213: positiveSegments: 0, negativeSegments: 3
2214: positiveSegments: 7, negativeSegments: 3
2221: positiveSegments: 4, negativeSegments: 0
2222: positiveSegments: 4, negativeSegments: 0
2224: exit early, no segments to save
2224: positiveSegments: 0, negativeSegments: 0
2225: positiveSegments: 8, negativeSegments: 12
2229: positiveSegments: 0, negativeSegments: 6
2231: positiveSegments: 8, negativeSegments: 6
2236: positiveSegments: 11, negativeSegments: 3
2238: positiveSegments: 0, negativeSegments: 18
2241: positiveSegments: 0, negativeSegments: 9
2242: positiveSegments: 0, negativeSegments: 6
2243: positiveSegments: 20, negativeSegments: 3
2244: positiveSegments: 0, negativeSegments: 6
2246: positiveSegments: 4, negativeSegments: 3
2248: positiveSegments: 3, negativeSegments: 3
2249: positiveSegments: 1, negativeSegments: 0
2251: positiveSegments: 2, negativeSegments: 0
2252: positiveSegments: 4, negativeSegments: 9
2258: positiveSegments: 4, negativeSegments: 3
2261: positiveSegments: 7, negativeSegments: 0
2265: positiveSegments: 7, negativeSegments: 0
2267: positiveSegments: 12, negativeSegments: 3
2272: positiveSegments: 15, negativeSegments: 0
2273: positiveSegments: 0, negativeSegments: 12
2275: positiveSegments: 7, negativeSegments: 3
2279: positiveSegments: 8, negativeSegments: 0
2280: positiveSegments: 8, negativeSegments: 6
2282: positiveSegments: 0, negativeSegments: 3
2291: positiveSegments: 4, negativeSegments: 0
2296: positiveSegments: 0, negativeSegments: 9
2298: positiveSegments: 4, negativeSegments: 3
2299: positiveSegments: 4, negativeSegments: 3
2300: positiveSegments: 4, negativeSegments: 18
2302: positiveSegments: 4, negativeSegments: 9
2304: positiveSegments: 6, negativeSegments: 6
2306: positiveSegments: 5, negativeSegments: 6
2307: positiveSegments: 0, negativeSegments: 12
2309: positiveSegments: 4, negativeSegments: 0
2310: positiveSegments: 0, negativeSegments: 15
2311: positiveSegments: 6, negativeSegments: 0
2315: positiveSegments: 8, negativeSegments: 0
count processed: 1000, current case index: 2317
2317: positiveSegments: 4, negativeSegments: 0
2318: positiveSegments: 22, negativeSegments: 0
2319: positiveSegments: 0, negativeSegments: 6
2321: positiveSegments: 4, negativeSegments: 3
2324: positiveSegments: 11, negativeSegments: 3
2325: positiveSegments: 16, negativeSegments: 3
2326: positiveSegments: 26, negativeSegments: 0
2327: positiveSegments: 6, negativeSegments: 0
2331: positiveSegments: 5, negativeSegments: 0
2332: positiveSegments: 16, negativeSegments: 6
2333: positiveSegments: 4, negativeSegments: 6
2334: positiveSegments: 9, negativeSegments: 3
2335: positiveSegments: 4, negativeSegments: 0
2336: positiveSegments: 23, negativeSegments: 12
2337: positiveSegments: 4, negativeSegments: 6
2339: positiveSegments: 4, negativeSegments: 0
2340: positiveSegments: 15, negativeSegments: 0
2341: positiveSegments: 0, negativeSegments: 3
2345: positiveSegments: 4, negativeSegments: 3
2346: positiveSegments: 0, negativeSegments: 3
2348: positiveSegments: 4, negativeSegments: 9
2349: positiveSegments: 13, negativeSegments: 6
2352: positiveSegments: 3, negativeSegments: 12
2353: positiveSegments: 8, negativeSegments: 12
2354: positiveSegments: 0, negativeSegments: 12
2356: positiveSegments: 13, negativeSegments: 0
2357: positiveSegments: 0, negativeSegments: 6
2359: positiveSegments: 4, negativeSegments: 0
2365: positiveSegments: 3, negativeSegments: 0
2371: positiveSegments: 10, negativeSegments: 3
2372: positiveSegments: 0, negativeSegments: 9
2373: positiveSegments: 7, negativeSegments: 6
2375: positiveSegments: 8, negativeSegments: 15
2377: positiveSegments: 9, negativeSegments: 3
2380: positiveSegments: 0, negativeSegments: 12
2383: positiveSegments: 14, negativeSegments: 3
2389: positiveSegments: 12, negativeSegments: 9
2392: positiveSegments: 0, negativeSegments: 9
2393: positiveSegments: 8, negativeSegments: 3
2394: positiveSegments: 0, negativeSegments: 9
2396: positiveSegments: 16, negativeSegments: 0
2401: positiveSegments: 0, negativeSegments: 6
2405: positiveSegments: 4, negativeSegments: 3
2411: positiveSegments: 10, negativeSegments: 0
2413: exit early, no segments to save
2413: positiveSegments: 0, negativeSegments: 0
2416: positiveSegments: 4, negativeSegments: 9
2417: positiveSegments: 0, negativeSegments: 3
2419: positiveSegments: 4, negativeSegments: 0
2420: positiveSegments: 0, negativeSegments: 21
2421: positiveSegments: 0, negativeSegments: 9
2422: positiveSegments: 8, negativeSegments: 0
2424: positiveSegments: 6, negativeSegments: 3
2425: positiveSegments: 0, negativeSegments: 6
2427: positiveSegments: 0, negativeSegments: 9
2428: positiveSegments: 4, negativeSegments: 3
2432: positiveSegments: 9, negativeSegments: 0
2433: positiveSegments: 20, negativeSegments: 21
2435: positiveSegments: 0, negativeSegments: 3
2436: positiveSegments: 0, negativeSegments: 3
2438: positiveSegments: 0, negativeSegments: 3
2441: positiveSegments: 10, negativeSegments: 6
2442: positiveSegments: 4, negativeSegments: 0
2443: positiveSegments: 7, negativeSegments: 27
2444: positiveSegments: 4, negativeSegments: 0
2445: positiveSegments: 12, negativeSegments: 3
2447: positiveSegments: 0, negativeSegments: 6
2450: positiveSegments: 0, negativeSegments: 3
2452: positiveSegments: 14, negativeSegments: 0
2453: positiveSegments: 18, negativeSegments: 6
2455: positiveSegments: 0, negativeSegments: 3
2458: positiveSegments: 22, negativeSegments: 9
2462: positiveSegments: 9, negativeSegments: 3
2469: positiveSegments: 0, negativeSegments: 6
2470: positiveSegments: 22, negativeSegments: 0
2471: positiveSegments: 10, negativeSegments: 0
2472: positiveSegments: 4, negativeSegments: 9
2474: positiveSegments: 0, negativeSegments: 6
2480: positiveSegments: 3, negativeSegments: 9
2481: positiveSegments: 0, negativeSegments: 3
2482: positiveSegments: 0, negativeSegments: 3
2483: positiveSegments: 4, negativeSegments: 0
2485: positiveSegments: 0, negativeSegments: 6
2487: positiveSegments: 24, negativeSegments: 9
2489: positiveSegments: 22, negativeSegments: 6
2493: positiveSegments: 10, negativeSegments: 3
2494: positiveSegments: 19, negativeSegments: 0
2495: positiveSegments: 4, negativeSegments: 9
2496: positiveSegments: 0, negativeSegments: 6
2497: positiveSegments: 9, negativeSegments: 9
2501: positiveSegments: 0, negativeSegments: 18
2503: positiveSegments: 0, negativeSegments: 3
2507: positiveSegments: 0, negativeSegments: 6
2508: positiveSegments: 0, negativeSegments: 9
2516: positiveSegments: 0, negativeSegments: 3
2517: positiveSegments: 8, negativeSegments: 0
2519: positiveSegments: 17, negativeSegments: 3
2521: positiveSegments: 10, negativeSegments: 0
2523: positiveSegments: 6, negativeSegments: 3
2527: positiveSegments: 0, negativeSegments: 6
2532: positiveSegments: 4, negativeSegments: 0
count processed: 1100, current case index: 2533
2533: positiveSegments: 4, negativeSegments: 0
2535: positiveSegments: 4, negativeSegments: 0
2537: positiveSegments: 0, negativeSegments: 6
2539: positiveSegments: 3, negativeSegments: 0
2544: positiveSegments: 10, negativeSegments: 12
2547: positiveSegments: 6, negativeSegments: 6
2553: positiveSegments: 7, negativeSegments: 0
2558: positiveSegments: 8, negativeSegments: 3
2559: positiveSegments: 10, negativeSegments: 0
2561: positiveSegments: 0, negativeSegments: 3
2562: positiveSegments: 0, negativeSegments: 6
2566: positiveSegments: 0, negativeSegments: 3
2568: positiveSegments: 2, negativeSegments: 12
2578: positiveSegments: 4, negativeSegments: 6
2580: positiveSegments: 12, negativeSegments: 6
2583: positiveSegments: 4, negativeSegments: 0
2584: positiveSegments: 17, negativeSegments: 9
2585: positiveSegments: 9, negativeSegments: 0
2587: positiveSegments: 8, negativeSegments: 0
2589: positiveSegments: 0, negativeSegments: 6
2590: positiveSegments: 0, negativeSegments: 6
2594: positiveSegments: 6, negativeSegments: 3
2596: positiveSegments: 4, negativeSegments: 0
2601: positiveSegments: 0, negativeSegments: 9
2605: positiveSegments: 1, negativeSegments: 0
2606: positiveSegments: 8, negativeSegments: 9
2607: positiveSegments: 0, negativeSegments: 3
2608: positiveSegments: 6, negativeSegments: 0
2609: positiveSegments: 0, negativeSegments: 3
2611: positiveSegments: 16, negativeSegments: 3
2612: positiveSegments: 2, negativeSegments: 0
2613: positiveSegments: 0, negativeSegments: 3
2618: positiveSegments: 4, negativeSegments: 15
2619: positiveSegments: 8, negativeSegments: 3
2622: positiveSegments: 8, negativeSegments: 3
2624: positiveSegments: 4, negativeSegments: 6
2627: positiveSegments: 6, negativeSegments: 0
2628: positiveSegments: 0, negativeSegments: 12
2630: positiveSegments: 6, negativeSegments: 6
2631: positiveSegments: 3, negativeSegments: 0
2632: positiveSegments: 10, negativeSegments: 0
2637: positiveSegments: 0, negativeSegments: 3
2639: positiveSegments: 4, negativeSegments: 0
2641: positiveSegments: 0, negativeSegments: 6
2644: positiveSegments: 4, negativeSegments: 0
2648: positiveSegments: 8, negativeSegments: 9
2652: positiveSegments: 0, negativeSegments: 15
2654: positiveSegments: 0, negativeSegments: 6
2655: positiveSegments: 0, negativeSegments: 3
2656: positiveSegments: 0, negativeSegments: 12
2657: positiveSegments: 0, negativeSegments: 6
2658: positiveSegments: 4, negativeSegments: 6
2662: positiveSegments: 8, negativeSegments: 18
2663: positiveSegments: 0, negativeSegments: 6
2664: positiveSegments: 0, negativeSegments: 12
2665: positiveSegments: 0, negativeSegments: 3
2667: positiveSegments: 0, negativeSegments: 9
2671: positiveSegments: 7, negativeSegments: 3
2674: positiveSegments: 8, negativeSegments: 9
2676: positiveSegments: 2, negativeSegments: 0
2678: positiveSegments: 3, negativeSegments: 3
2680: positiveSegments: 9, negativeSegments: 0
2687: positiveSegments: 6, negativeSegments: 0
2688: positiveSegments: 12, negativeSegments: 3
2690: positiveSegments: 4, negativeSegments: 3
2693: positiveSegments: 11, negativeSegments: 6
2697: positiveSegments: 0, negativeSegments: 9
2698: positiveSegments: 0, negativeSegments: 3
2699: positiveSegments: 0, negativeSegments: 9
2700: positiveSegments: 2, negativeSegments: 0
2701: positiveSegments: 0, negativeSegments: 6
2703: positiveSegments: 0, negativeSegments: 15
2705: positiveSegments: 0, negativeSegments: 6
2706: positiveSegments: 32, negativeSegments: 3
2712: positiveSegments: 10, negativeSegments: 3
2713: positiveSegments: 0, negativeSegments: 3
2716: positiveSegments: 0, negativeSegments: 9
2717: positiveSegments: 8, negativeSegments: 0
2722: positiveSegments: 38, negativeSegments: 0
2724: positiveSegments: 4, negativeSegments: 12
2735: positiveSegments: 0, negativeSegments: 3
2738: positiveSegments: 7, negativeSegments: 3
2741: positiveSegments: 10, negativeSegments: 6
2742: positiveSegments: 11, negativeSegments: 3
2744: positiveSegments: 0, negativeSegments: 6
2746: positiveSegments: 4, negativeSegments: 0
2747: positiveSegments: 4, negativeSegments: 6
2749: positiveSegments: 4, negativeSegments: 3
2750: positiveSegments: 0, negativeSegments: 6
2751: positiveSegments: 21, negativeSegments: 6
2755: positiveSegments: 11, negativeSegments: 9
2760: positiveSegments: 0, negativeSegments: 3
2761: positiveSegments: 11, negativeSegments: 0
2762: positiveSegments: 0, negativeSegments: 3
2763: positiveSegments: 3, negativeSegments: 0
2764: positiveSegments: 16, negativeSegments: 6
2765: positiveSegments: 6, negativeSegments: 0
2766: positiveSegments: 0, negativeSegments: 15
2769: positiveSegments: 0, negativeSegments: 9
2774: positiveSegments: 3, negativeSegments: 0
count processed: 1200, current case index: 2775
2775: positiveSegments: 16, negativeSegments: 3
2777: positiveSegments: 4, negativeSegments: 3
2778: positiveSegments: 12, negativeSegments: 3
2779: positiveSegments: 4, negativeSegments: 3
2781: positiveSegments: 13, negativeSegments: 9
2783: positiveSegments: 12, negativeSegments: 0
2785: positiveSegments: 20, negativeSegments: 3
2795: positiveSegments: 8, negativeSegments: 6
2800: positiveSegments: 0, negativeSegments: 12
2803: positiveSegments: 2, negativeSegments: 0
2804: positiveSegments: 8, negativeSegments: 9
2806: positiveSegments: 1, negativeSegments: 0
2807: positiveSegments: 4, negativeSegments: 6
2809: positiveSegments: 8, negativeSegments: 3
2814: positiveSegments: 3, negativeSegments: 6
2815: positiveSegments: 15, negativeSegments: 0
2819: positiveSegments: 6, negativeSegments: 0
2820: positiveSegments: 4, negativeSegments: 9
2823: positiveSegments: 9, negativeSegments: 6
2824: positiveSegments: 15, negativeSegments: 9
2827: positiveSegments: 12, negativeSegments: 0
2828: positiveSegments: 20, negativeSegments: 6
2829: positiveSegments: 13, negativeSegments: 3
2830: positiveSegments: 4, negativeSegments: 0
2835: positiveSegments: 18, negativeSegments: 0
2836: positiveSegments: 0, negativeSegments: 12
2837: positiveSegments: 4, negativeSegments: 6
2839: positiveSegments: 3, negativeSegments: 3
2845: positiveSegments: 0, negativeSegments: 3
2847: positiveSegments: 0, negativeSegments: 15
2848: positiveSegments: 0, negativeSegments: 9
2849: positiveSegments: 8, negativeSegments: 3
2850: positiveSegments: 4, negativeSegments: 6
2851: positiveSegments: 20, negativeSegments: 3
2854: positiveSegments: 4, negativeSegments: 3
2858: positiveSegments: 19, negativeSegments: 3
2859: positiveSegments: 3, negativeSegments: 0
2860: positiveSegments: 4, negativeSegments: 18
2861: positiveSegments: 6, negativeSegments: 0
2863: positiveSegments: 0, negativeSegments: 6
2864: positiveSegments: 4, negativeSegments: 0
2868: positiveSegments: 4, negativeSegments: 3
2871: positiveSegments: 0, negativeSegments: 6
2872: positiveSegments: 8, negativeSegments: 6
2876: positiveSegments: 11, negativeSegments: 9
2883: positiveSegments: 4, negativeSegments: 3
2888: positiveSegments: 0, negativeSegments: 6
2889: positiveSegments: 7, negativeSegments: 0
2890: positiveSegments: 6, negativeSegments: 0
2891: positiveSegments: 6, negativeSegments: 0
2895: positiveSegments: 0, negativeSegments: 3
2903: positiveSegments: 3, negativeSegments: 0
2905: positiveSegments: 4, negativeSegments: 9
2910: positiveSegments: 0, negativeSegments: 3
2911: positiveSegments: 14, negativeSegments: 12
2914: positiveSegments: 0, negativeSegments: 9
2922: positiveSegments: 4, negativeSegments: 0
2924: positiveSegments: 7, negativeSegments: 0
2929: positiveSegments: 10, negativeSegments: 0
2930: positiveSegments: 0, negativeSegments: 6
2931: positiveSegments: 0, negativeSegments: 3
2935: positiveSegments: 16, negativeSegments: 0
2939: positiveSegments: 0, negativeSegments: 15
2940: positiveSegments: 10, negativeSegments: 6
2943: positiveSegments: 0, negativeSegments: 6
2944: positiveSegments: 0, negativeSegments: 6
2945: positiveSegments: 22, negativeSegments: 9
2947: positiveSegments: 23, negativeSegments: 0
2949: positiveSegments: 2, negativeSegments: 0
2952: positiveSegments: 0, negativeSegments: 21
2954: positiveSegments: 0, negativeSegments: 12
2955: positiveSegments: 4, negativeSegments: 3
2956: positiveSegments: 0, negativeSegments: 9
2958: positiveSegments: 0, negativeSegments: 3
2959: positiveSegments: 8, negativeSegments: 12
2960: positiveSegments: 18, negativeSegments: 0
2961: positiveSegments: 12, negativeSegments: 9
2964: positiveSegments: 13, negativeSegments: 0
2966: positiveSegments: 3, negativeSegments: 3
2970: positiveSegments: 0, negativeSegments: 12
2971: positiveSegments: 0, negativeSegments: 6
2972: positiveSegments: 8, negativeSegments: 0
2974: positiveSegments: 4, negativeSegments: 3
2975: positiveSegments: 2, negativeSegments: 9
2977: positiveSegments: 8, negativeSegments: 0
2980: positiveSegments: 0, negativeSegments: 3
2981: positiveSegments: 0, negativeSegments: 12
2982: positiveSegments: 8, negativeSegments: 9
2987: positiveSegments: 0, negativeSegments: 3
2991: positiveSegments: 12, negativeSegments: 3
2992: positiveSegments: 12, negativeSegments: 3
2993: positiveSegments: 16, negativeSegments: 3
2998: positiveSegments: 0, negativeSegments: 3
2999: positiveSegments: 7, negativeSegments: 0
3000: positiveSegments: 12, negativeSegments: 0
3001: positiveSegments: 8, negativeSegments: 3
3003: positiveSegments: 2, negativeSegments: 0
3004: positiveSegments: 0, negativeSegments: 9
3006: positiveSegments: 6, negativeSegments: 0
3009: positiveSegments: 11, negativeSegments: 3
count processed: 1300, current case index: 3014
3014: positiveSegments: 12, negativeSegments: 6
3019: positiveSegments: 27, negativeSegments: 0
3020: positiveSegments: 4, negativeSegments: 0
3023: positiveSegments: 6, negativeSegments: 3
3024: positiveSegments: 23, negativeSegments: 3
3027: positiveSegments: 4, negativeSegments: 0
3028: positiveSegments: 14, negativeSegments: 3
3030: positiveSegments: 0, negativeSegments: 6
3034: positiveSegments: 6, negativeSegments: 0
3035: positiveSegments: 8, negativeSegments: 6
3037: positiveSegments: 14, negativeSegments: 12
3039: positiveSegments: 10, negativeSegments: 0
3040: positiveSegments: 8, negativeSegments: 6
3042: positiveSegments: 7, negativeSegments: 3
3044: positiveSegments: 7, negativeSegments: 3
3047: positiveSegments: 10, negativeSegments: 0
3048: positiveSegments: 0, negativeSegments: 9
3050: positiveSegments: 4, negativeSegments: 0
3051: positiveSegments: 9, negativeSegments: 9
3057: positiveSegments: 0, negativeSegments: 3
3058: positiveSegments: 26, negativeSegments: 0
3059: positiveSegments: 0, negativeSegments: 9
3060: positiveSegments: 0, negativeSegments: 6
3070: positiveSegments: 0, negativeSegments: 3
3073: positiveSegments: 0, negativeSegments: 6
3074: positiveSegments: 21, negativeSegments: 6
3075: positiveSegments: 4, negativeSegments: 6
3079: positiveSegments: 8, negativeSegments: 3
3082: positiveSegments: 4, negativeSegments: 6
3085: positiveSegments: 3, negativeSegments: 0
3087: positiveSegments: 8, negativeSegments: 0
3088: positiveSegments: 3, negativeSegments: 0
3090: positiveSegments: 0, negativeSegments: 6
3091: positiveSegments: 3, negativeSegments: 0
3092: positiveSegments: 0, negativeSegments: 6
3093: positiveSegments: 8, negativeSegments: 15
3094: positiveSegments: 4, negativeSegments: 6
3095: positiveSegments: 4, negativeSegments: 0
3096: positiveSegments: 4, negativeSegments: 3
3097: positiveSegments: 6, negativeSegments: 0
3098: positiveSegments: 0, negativeSegments: 6
3101: positiveSegments: 7, negativeSegments: 0
3102: positiveSegments: 0, negativeSegments: 3
3106: positiveSegments: 0, negativeSegments: 6
3107: positiveSegments: 5, negativeSegments: 3
3108: positiveSegments: 11, negativeSegments: 3
3110: positiveSegments: 0, negativeSegments: 9
3112: exit early, no segments to save
3112: positiveSegments: 0, negativeSegments: 0
3113: positiveSegments: 21, negativeSegments: 6
3116: positiveSegments: 8, negativeSegments: 0
3118: positiveSegments: 0, negativeSegments: 12
3120: positiveSegments: 10, negativeSegments: 0
3121: positiveSegments: 0, negativeSegments: 15
3122: positiveSegments: 0, negativeSegments: 3
3125: positiveSegments: 0, negativeSegments: 6
3126: positiveSegments: 5, negativeSegments: 0
3130: positiveSegments: 0, negativeSegments: 9
3133: positiveSegments: 12, negativeSegments: 3
3134: positiveSegments: 12, negativeSegments: 21
3135: positiveSegments: 0, negativeSegments: 3
3136: positiveSegments: 10, negativeSegments: 0
3137: positiveSegments: 0, negativeSegments: 12
3138: positiveSegments: 17, negativeSegments: 0
3141: positiveSegments: 4, negativeSegments: 3
3142: positiveSegments: 12, negativeSegments: 0
3145: positiveSegments: 16, negativeSegments: 9
3146: positiveSegments: 28, negativeSegments: 3
3147: positiveSegments: 11, negativeSegments: 6
3148: positiveSegments: 28, negativeSegments: 12
3149: positiveSegments: 0, negativeSegments: 12
3150: positiveSegments: 0, negativeSegments: 9
3153: positiveSegments: 4, negativeSegments: 0
3154: positiveSegments: 10, negativeSegments: 6
3155: positiveSegments: 10, negativeSegments: 9
3157: positiveSegments: 12, negativeSegments: 0
3159: positiveSegments: 0, negativeSegments: 6
3161: positiveSegments: 11, negativeSegments: 0
3164: positiveSegments: 4, negativeSegments: 12
3165: positiveSegments: 23, negativeSegments: 0
3167: positiveSegments: 0, negativeSegments: 12
3169: positiveSegments: 4, negativeSegments: 0
3170: positiveSegments: 0, negativeSegments: 3
3172: positiveSegments: 0, negativeSegments: 3
3173: positiveSegments: 14, negativeSegments: 3
3175: positiveSegments: 0, negativeSegments: 3
3180: positiveSegments: 4, negativeSegments: 3
3181: positiveSegments: 0, negativeSegments: 12
3184: positiveSegments: 0, negativeSegments: 3
3186: positiveSegments: 4, negativeSegments: 9
3188: positiveSegments: 31, negativeSegments: 0
3189: positiveSegments: 0, negativeSegments: 12
3193: positiveSegments: 14, negativeSegments: 21
3194: positiveSegments: 0, negativeSegments: 18
3196: positiveSegments: 0, negativeSegments: 9
3198: positiveSegments: 2, negativeSegments: 0
3202: positiveSegments: 3, negativeSegments: 0
3203: positiveSegments: 4, negativeSegments: 12
3208: positiveSegments: 0, negativeSegments: 6
3211: positiveSegments: 0, negativeSegments: 6
3216: positiveSegments: 5, negativeSegments: 3
count processed: 1400, current case index: 3218
3218: positiveSegments: 0, negativeSegments: 3
3221: positiveSegments: 0, negativeSegments: 3
3222: positiveSegments: 23, negativeSegments: 0
3223: positiveSegments: 0, negativeSegments: 12
3228: positiveSegments: 25, negativeSegments: 6
3229: positiveSegments: 0, negativeSegments: 6
3231: positiveSegments: 4, negativeSegments: 6
3232: positiveSegments: 0, negativeSegments: 9
3233: positiveSegments: 16, negativeSegments: 3
3235: positiveSegments: 7, negativeSegments: 0
3240: positiveSegments: 0, negativeSegments: 6
3243: positiveSegments: 0, negativeSegments: 6
3247: positiveSegments: 27, negativeSegments: 6
3248: positiveSegments: 8, negativeSegments: 0
3250: positiveSegments: 8, negativeSegments: 6
3255: positiveSegments: 24, negativeSegments: 0
3256: positiveSegments: 4, negativeSegments: 6
3260: positiveSegments: 3, negativeSegments: 3
3263: positiveSegments: 4, negativeSegments: 6
3267: positiveSegments: 0, negativeSegments: 18
3270: positiveSegments: 39, negativeSegments: 0
3271: positiveSegments: 8, negativeSegments: 6
3273: positiveSegments: 21, negativeSegments: 0
3275: positiveSegments: 3, negativeSegments: 3
3276: positiveSegments: 4, negativeSegments: 0
3279: positiveSegments: 19, negativeSegments: 3
3284: positiveSegments: 16, negativeSegments: 6
3286: positiveSegments: 18, negativeSegments: 0
3287: positiveSegments: 0, negativeSegments: 12
3291: positiveSegments: 20, negativeSegments: 6
3293: positiveSegments: 29, negativeSegments: 6
3295: positiveSegments: 8, negativeSegments: 0
3297: positiveSegments: 18, negativeSegments: 3
3299: positiveSegments: 0, negativeSegments: 3
3304: positiveSegments: 4, negativeSegments: 3
3307: positiveSegments: 10, negativeSegments: 6
3309: positiveSegments: 0, negativeSegments: 6
3310: positiveSegments: 24, negativeSegments: 0
3311: positiveSegments: 18, negativeSegments: 12
3312: positiveSegments: 0, negativeSegments: 3
3315: positiveSegments: 0, negativeSegments: 6
3317: positiveSegments: 4, negativeSegments: 6
3318: positiveSegments: 6, negativeSegments: 9
3321: positiveSegments: 4, negativeSegments: 6
3325: positiveSegments: 6, negativeSegments: 6
3327: positiveSegments: 0, negativeSegments: 3
3328: positiveSegments: 6, negativeSegments: 3
3329: positiveSegments: 6, negativeSegments: 0
3330: positiveSegments: 4, negativeSegments: 12
3331: positiveSegments: 12, negativeSegments: 3
3333: positiveSegments: 3, negativeSegments: 3
3334: positiveSegments: 23, negativeSegments: 9
3336: positiveSegments: 6, negativeSegments: 0
3338: positiveSegments: 10, negativeSegments: 3
3340: positiveSegments: 10, negativeSegments: 3
3342: positiveSegments: 0, negativeSegments: 9
3344: positiveSegments: 4, negativeSegments: 3
3345: positiveSegments: 22, negativeSegments: 3
3348: positiveSegments: 20, negativeSegments: 6
3352: positiveSegments: 14, negativeSegments: 6
3355: positiveSegments: 2, negativeSegments: 0
3357: positiveSegments: 2, negativeSegments: 0
3358: positiveSegments: 8, negativeSegments: 9
3359: positiveSegments: 4, negativeSegments: 0
3361: positiveSegments: 0, negativeSegments: 18
3362: positiveSegments: 0, negativeSegments: 3
3363: positiveSegments: 0, negativeSegments: 12
3364: positiveSegments: 3, negativeSegments: 6
3367: positiveSegments: 0, negativeSegments: 3
3369: positiveSegments: 0, negativeSegments: 3
3373: positiveSegments: 25, negativeSegments: 6
3374: positiveSegments: 0, negativeSegments: 3
3375: positiveSegments: 0, negativeSegments: 3
3379: positiveSegments: 0, negativeSegments: 6
3380: positiveSegments: 0, negativeSegments: 9
3381: positiveSegments: 0, negativeSegments: 18
3382: positiveSegments: 11, negativeSegments: 3
3383: positiveSegments: 4, negativeSegments: 6
3385: positiveSegments: 3, negativeSegments: 9
3389: positiveSegments: 4, negativeSegments: 3
3394: positiveSegments: 0, negativeSegments: 12
3396: positiveSegments: 3, negativeSegments: 0
3398: positiveSegments: 0, negativeSegments: 3
3399: positiveSegments: 0, negativeSegments: 3
3401: positiveSegments: 8, negativeSegments: 3
3404: positiveSegments: 4, negativeSegments: 3
3409: positiveSegments: 12, negativeSegments: 3
3412: positiveSegments: 3, negativeSegments: 0
3414: positiveSegments: 0, negativeSegments: 3
3415: positiveSegments: 21, negativeSegments: 3
3418: positiveSegments: 0, negativeSegments: 9
3422: positiveSegments: 0, negativeSegments: 3
3427: positiveSegments: 4, negativeSegments: 0
3428: positiveSegments: 7, negativeSegments: 3
3429: positiveSegments: 11, negativeSegments: 6
3431: positiveSegments: 0, negativeSegments: 6
3435: positiveSegments: 0, negativeSegments: 3
3436: positiveSegments: 0, negativeSegments: 6
3440: positiveSegments: 0, negativeSegments: 6
3441: positiveSegments: 0, negativeSegments: 3
count processed: 1500, current case index: 3442
3442: positiveSegments: 14, negativeSegments: 3
3447: positiveSegments: 0, negativeSegments: 3
3449: positiveSegments: 3, negativeSegments: 3
3450: positiveSegments: 0, negativeSegments: 6
3451: positiveSegments: 0, negativeSegments: 12
3453: positiveSegments: 0, negativeSegments: 6
3457: positiveSegments: 9, negativeSegments: 0
3458: positiveSegments: 4, negativeSegments: 0
3460: positiveSegments: 10, negativeSegments: 0
3462: positiveSegments: 7, negativeSegments: 6
3463: positiveSegments: 0, negativeSegments: 9
3464: positiveSegments: 0, negativeSegments: 6
3468: positiveSegments: 0, negativeSegments: 9
3470: positiveSegments: 8, negativeSegments: 0
3472: positiveSegments: 2, negativeSegments: 0
3475: positiveSegments: 0, negativeSegments: 3
3478: positiveSegments: 3, negativeSegments: 3
3479: positiveSegments: 0, negativeSegments: 3
3481: positiveSegments: 14, negativeSegments: 0
3483: positiveSegments: 20, negativeSegments: 15
3488: positiveSegments: 0, negativeSegments: 6
3492: positiveSegments: 4, negativeSegments: 0
3499: positiveSegments: 0, negativeSegments: 15
3501: positiveSegments: 4, negativeSegments: 3
3502: positiveSegments: 5, negativeSegments: 9
3503: positiveSegments: 0, negativeSegments: 6
3505: positiveSegments: 0, negativeSegments: 6
3506: positiveSegments: 14, negativeSegments: 0
3513: positiveSegments: 0, negativeSegments: 15
3514: positiveSegments: 6, negativeSegments: 0
3515: positiveSegments: 4, negativeSegments: 6
3516: positiveSegments: 6, negativeSegments: 9
3517: positiveSegments: 3, negativeSegments: 0
3521: positiveSegments: 35, negativeSegments: 6
3524: positiveSegments: 26, negativeSegments: 0
3526: positiveSegments: 3, negativeSegments: 3
3527: positiveSegments: 0, negativeSegments: 6
3528: positiveSegments: 0, negativeSegments: 3
3532: positiveSegments: 0, negativeSegments: 3
3535: positiveSegments: 14, negativeSegments: 0
3537: positiveSegments: 0, negativeSegments: 9
3544: positiveSegments: 0, negativeSegments: 6
3546: positiveSegments: 16, negativeSegments: 6
3549: positiveSegments: 18, negativeSegments: 0
3550: positiveSegments: 20, negativeSegments: 3
3555: positiveSegments: 0, negativeSegments: 6
3558: positiveSegments: 0, negativeSegments: 12
3559: positiveSegments: 17, negativeSegments: 9
3560: positiveSegments: 0, negativeSegments: 3
3562: positiveSegments: 0, negativeSegments: 3
3564: positiveSegments: 0, negativeSegments: 3
3565: positiveSegments: 12, negativeSegments: 0
3566: positiveSegments: 26, negativeSegments: 3
3567: positiveSegments: 0, negativeSegments: 6
3568: positiveSegments: 4, negativeSegments: 3
3569: positiveSegments: 6, negativeSegments: 0
3570: positiveSegments: 4, negativeSegments: 0
3571: positiveSegments: 4, negativeSegments: 15
3572: positiveSegments: 24, negativeSegments: 6
3573: positiveSegments: 8, negativeSegments: 3
3576: positiveSegments: 9, negativeSegments: 6
3581: positiveSegments: 0, negativeSegments: 3
3582: positiveSegments: 4, negativeSegments: 6
3585: positiveSegments: 3, negativeSegments: 0
3588: positiveSegments: 26, negativeSegments: 0
3589: positiveSegments: 0, negativeSegments: 3
3593: positiveSegments: 0, negativeSegments: 9
3594: positiveSegments: 0, negativeSegments: 18
3596: exit early, no segments to save
3596: positiveSegments: 0, negativeSegments: 0
3602: positiveSegments: 5, negativeSegments: 6
3603: positiveSegments: 4, negativeSegments: 6
3606: positiveSegments: 0, negativeSegments: 12
3607: positiveSegments: 14, negativeSegments: 3
3608: positiveSegments: 4, negativeSegments: 3
3609: positiveSegments: 5, negativeSegments: 0
3611: positiveSegments: 0, negativeSegments: 3
3614: positiveSegments: 6, negativeSegments: 0
3616: positiveSegments: 18, negativeSegments: 0
3618: positiveSegments: 0, negativeSegments: 9
3620: positiveSegments: 4, negativeSegments: 3
3621: positiveSegments: 17, negativeSegments: 6
3623: positiveSegments: 12, negativeSegments: 0
3625: positiveSegments: 26, negativeSegments: 12
3628: positiveSegments: 6, negativeSegments: 0
3629: positiveSegments: 0, negativeSegments: 3
3631: positiveSegments: 23, negativeSegments: 0
3642: positiveSegments: 0, negativeSegments: 6
3648: exit early, no segments to save
3648: positiveSegments: 0, negativeSegments: 0
3652: positiveSegments: 4, negativeSegments: 18
3656: positiveSegments: 0, negativeSegments: 21
3658: positiveSegments: 0, negativeSegments: 9
3660: positiveSegments: 7, negativeSegments: 0
3662: positiveSegments: 0, negativeSegments: 6
3663: positiveSegments: 0, negativeSegments: 9
3668: positiveSegments: 0, negativeSegments: 15
3669: positiveSegments: 12, negativeSegments: 0
3672: positiveSegments: 4, negativeSegments: 0
3674: positiveSegments: 0, negativeSegments: 15
3677: positiveSegments: 0, negativeSegments: 6
3678: positiveSegments: 4, negativeSegments: 6
count processed: 1600, current case index: 3682
3682: positiveSegments: 7, negativeSegments: 0
3686: positiveSegments: 17, negativeSegments: 6
3687: positiveSegments: 7, negativeSegments: 9
3688: positiveSegments: 2, negativeSegments: 0
3689: positiveSegments: 12, negativeSegments: 3
3690: positiveSegments: 4, negativeSegments: 3
3691: positiveSegments: 0, negativeSegments: 3
3694: positiveSegments: 12, negativeSegments: 9
3697: positiveSegments: 0, negativeSegments: 9
3699: positiveSegments: 7, negativeSegments: 6
3700: positiveSegments: 14, negativeSegments: 3
3702: positiveSegments: 4, negativeSegments: 9
3703: positiveSegments: 0, negativeSegments: 9
3704: positiveSegments: 3, negativeSegments: 6
3706: positiveSegments: 0, negativeSegments: 18
3710: positiveSegments: 4, negativeSegments: 9
3711: positiveSegments: 17, negativeSegments: 6
3712: positiveSegments: 4, negativeSegments: 6
3713: positiveSegments: 16, negativeSegments: 0
3719: positiveSegments: 6, negativeSegments: 3
3722: positiveSegments: 12, negativeSegments: 6
3723: positiveSegments: 0, negativeSegments: 3
3724: positiveSegments: 3, negativeSegments: 0
3725: positiveSegments: 15, negativeSegments: 3
3727: positiveSegments: 0, negativeSegments: 3
3728: positiveSegments: 10, negativeSegments: 0
3729: positiveSegments: 0, negativeSegments: 9
3730: positiveSegments: 0, negativeSegments: 3
3732: positiveSegments: 0, negativeSegments: 3
3737: positiveSegments: 15, negativeSegments: 6
3739: positiveSegments: 5, negativeSegments: 0
3740: positiveSegments: 0, negativeSegments: 9
3743: positiveSegments: 0, negativeSegments: 9
3744: positiveSegments: 20, negativeSegments: 0
3748: positiveSegments: 2, negativeSegments: 3
3749: positiveSegments: 12, negativeSegments: 12
3750: positiveSegments: 0, negativeSegments: 9
3752: positiveSegments: 0, negativeSegments: 18
3753: positiveSegments: 0, negativeSegments: 24
3757: positiveSegments: 0, negativeSegments: 6
3758: positiveSegments: 5, negativeSegments: 3
3761: positiveSegments: 8, negativeSegments: 3
3763: positiveSegments: 0, negativeSegments: 6
3764: positiveSegments: 0, negativeSegments: 6
3768: positiveSegments: 4, negativeSegments: 9
3774: positiveSegments: 8, negativeSegments: 0
3775: positiveSegments: 12, negativeSegments: 6
3776: positiveSegments: 0, negativeSegments: 3
3777: positiveSegments: 6, negativeSegments: 3
3782: positiveSegments: 4, negativeSegments: 6
3783: positiveSegments: 6, negativeSegments: 0
3784: positiveSegments: 12, negativeSegments: 0
3785: positiveSegments: 18, negativeSegments: 9
3789: positiveSegments: 0, negativeSegments: 6
3791: positiveSegments: 0, negativeSegments: 12
3793: positiveSegments: 0, negativeSegments: 3
3798: positiveSegments: 4, negativeSegments: 15
3799: positiveSegments: 4, negativeSegments: 0
3800: positiveSegments: 21, negativeSegments: 3
3802: positiveSegments: 4, negativeSegments: 0
3803: positiveSegments: 8, negativeSegments: 0
3805: positiveSegments: 7, negativeSegments: 6
3810: positiveSegments: 0, negativeSegments: 3
3812: positiveSegments: 6, negativeSegments: 0
3813: positiveSegments: 4, negativeSegments: 6
3814: positiveSegments: 0, negativeSegments: 6
3816: positiveSegments: 2, negativeSegments: 3
3817: positiveSegments: 2, negativeSegments: 6
3818: positiveSegments: 0, negativeSegments: 15
3819: positiveSegments: 4, negativeSegments: 3
3822: positiveSegments: 13, negativeSegments: 3
3823: positiveSegments: 0, negativeSegments: 9
3824: positiveSegments: 27, negativeSegments: 6
3825: positiveSegments: 8, negativeSegments: 0
3828: positiveSegments: 8, negativeSegments: 3
3831: positiveSegments: 4, negativeSegments: 9
3832: positiveSegments: 0, negativeSegments: 18
3835: positiveSegments: 0, negativeSegments: 9
3836: positiveSegments: 15, negativeSegments: 6
3837: positiveSegments: 4, negativeSegments: 9
3839: positiveSegments: 4, negativeSegments: 0
3840: positiveSegments: 6, negativeSegments: 3
3842: positiveSegments: 12, negativeSegments: 18
3843: positiveSegments: 16, negativeSegments: 6
3844: positiveSegments: 0, negativeSegments: 18
3845: positiveSegments: 4, negativeSegments: 6
3846: positiveSegments: 12, negativeSegments: 9
3848: positiveSegments: 0, negativeSegments: 3
3849: positiveSegments: 1, negativeSegments: 3
3850: positiveSegments: 0, negativeSegments: 15
3854: positiveSegments: 37, negativeSegments: 6
3855: positiveSegments: 0, negativeSegments: 6
3857: positiveSegments: 9, negativeSegments: 3
3859: positiveSegments: 1, negativeSegments: 0
3863: positiveSegments: 8, negativeSegments: 15
3864: positiveSegments: 8, negativeSegments: 3
3868: exit early, no segments to save
3868: positiveSegments: 0, negativeSegments: 0
3870: positiveSegments: 15, negativeSegments: 9
3877: positiveSegments: 5, negativeSegments: 6
3878: positiveSegments: 16, negativeSegments: 6
count processed: 1700, current case index: 3879
3879: positiveSegments: 16, negativeSegments: 0
3886: positiveSegments: 0, negativeSegments: 3
3887: positiveSegments: 0, negativeSegments: 3
3888: positiveSegments: 0, negativeSegments: 6
3889: positiveSegments: 0, negativeSegments: 12
3890: positiveSegments: 8, negativeSegments: 0
3891: positiveSegments: 0, negativeSegments: 6
3893: positiveSegments: 7, negativeSegments: 3
3894: positiveSegments: 0, negativeSegments: 3
3895: positiveSegments: 8, negativeSegments: 9
3898: positiveSegments: 8, negativeSegments: 0
3902: positiveSegments: 4, negativeSegments: 6
3904: positiveSegments: 8, negativeSegments: 0
3906: positiveSegments: 4, negativeSegments: 3
3910: positiveSegments: 0, negativeSegments: 3
3912: positiveSegments: 0, negativeSegments: 3
3913: positiveSegments: 14, negativeSegments: 6
3919: positiveSegments: 11, negativeSegments: 3
3925: positiveSegments: 0, negativeSegments: 9
3928: positiveSegments: 4, negativeSegments: 0
3929: positiveSegments: 10, negativeSegments: 3
3930: positiveSegments: 17, negativeSegments: 3
3931: positiveSegments: 9, negativeSegments: 3
3934: positiveSegments: 0, negativeSegments: 3
3935: positiveSegments: 4, negativeSegments: 0
3936: positiveSegments: 8, negativeSegments: 0
3937: positiveSegments: 3, negativeSegments: 0
3938: positiveSegments: 4, negativeSegments: 3
3944: positiveSegments: 5, negativeSegments: 15
3949: positiveSegments: 8, negativeSegments: 6
3950: positiveSegments: 4, negativeSegments: 15
3955: positiveSegments: 16, negativeSegments: 0
3958: positiveSegments: 4, negativeSegments: 6
3962: positiveSegments: 8, negativeSegments: 0
3963: positiveSegments: 0, negativeSegments: 12
3967: positiveSegments: 0, negativeSegments: 9
3968: positiveSegments: 7, negativeSegments: 3
3972: positiveSegments: 0, negativeSegments: 3
3973: positiveSegments: 0, negativeSegments: 12
3974: positiveSegments: 0, negativeSegments: 12
3975: positiveSegments: 4, negativeSegments: 9
3976: positiveSegments: 0, negativeSegments: 18
3978: positiveSegments: 4, negativeSegments: 0
3981: positiveSegments: 4, negativeSegments: 0
3986: positiveSegments: 0, negativeSegments: 3
3987: positiveSegments: 0, negativeSegments: 6
3988: positiveSegments: 8, negativeSegments: 3
3991: positiveSegments: 0, negativeSegments: 15
3993: positiveSegments: 11, negativeSegments: 9
3994: positiveSegments: 0, negativeSegments: 24
3998: positiveSegments: 0, negativeSegments: 18
3999: positiveSegments: 0, negativeSegments: 3
4000: positiveSegments: 5, negativeSegments: 0
4004: positiveSegments: 0, negativeSegments: 6
4005: positiveSegments: 3, negativeSegments: 9
4007: positiveSegments: 4, negativeSegments: 0
4009: positiveSegments: 1, negativeSegments: 0
4010: positiveSegments: 4, negativeSegments: 3
4011: positiveSegments: 0, negativeSegments: 9
4012: positiveSegments: 3, negativeSegments: 3
4013: positiveSegments: 13, negativeSegments: 9
4016: positiveSegments: 2, negativeSegments: 0
4017: positiveSegments: 1, negativeSegments: 3
4020: positiveSegments: 0, negativeSegments: 6
4022: positiveSegments: 3, negativeSegments: 0
4024: positiveSegments: 5, negativeSegments: 0
4026: positiveSegments: 0, negativeSegments: 18
4027: positiveSegments: 0, negativeSegments: 6
4028: positiveSegments: 0, negativeSegments: 3
4030: positiveSegments: 0, negativeSegments: 9
4032: positiveSegments: 4, negativeSegments: 9
4033: positiveSegments: 15, negativeSegments: 0
4034: positiveSegments: 9, negativeSegments: 0
4035: positiveSegments: 4, negativeSegments: 6
4036: positiveSegments: 8, negativeSegments: 9
4037: positiveSegments: 0, negativeSegments: 3
4040: positiveSegments: 12, negativeSegments: 3
4042: positiveSegments: 0, negativeSegments: 9
4043: positiveSegments: 24, negativeSegments: 9
4045: positiveSegments: 12, negativeSegments: 0
4046: positiveSegments: 0, negativeSegments: 3
4048: positiveSegments: 5, negativeSegments: 0
4050: positiveSegments: 0, negativeSegments: 15
4054: positiveSegments: 15, negativeSegments: 6
4060: positiveSegments: 4, negativeSegments: 15
4066: positiveSegments: 17, negativeSegments: 3
4067: positiveSegments: 0, negativeSegments: 3
4069: positiveSegments: 0, negativeSegments: 3
4070: positiveSegments: 15, negativeSegments: 0
4072: positiveSegments: 13, negativeSegments: 0
4073: positiveSegments: 0, negativeSegments: 3
4074: positiveSegments: 0, negativeSegments: 15
4083: positiveSegments: 0, negativeSegments: 3
4091: positiveSegments: 0, negativeSegments: 15
4093: positiveSegments: 24, negativeSegments: 6
4098: positiveSegments: 4, negativeSegments: 15
4100: positiveSegments: 0, negativeSegments: 3
4101: positiveSegments: 4, negativeSegments: 3
4106: positiveSegments: 4, negativeSegments: 3
4107: positiveSegments: 3, negativeSegments: 0
count processed: 1800, current case index: 4109
4109: positiveSegments: 6, negativeSegments: 0
4112: positiveSegments: 4, negativeSegments: 0
4114: positiveSegments: 20, negativeSegments: 6
4115: positiveSegments: 4, negativeSegments: 12
4116: positiveSegments: 4, negativeSegments: 3
4120: positiveSegments: 0, negativeSegments: 3
4127: positiveSegments: 8, negativeSegments: 9
4133: positiveSegments: 0, negativeSegments: 6
4137: positiveSegments: 18, negativeSegments: 3
4140: positiveSegments: 36, negativeSegments: 6
4143: positiveSegments: 22, negativeSegments: 12
4144: positiveSegments: 20, negativeSegments: 0
4146: positiveSegments: 0, negativeSegments: 18
4148: positiveSegments: 0, negativeSegments: 6
4149: positiveSegments: 9, negativeSegments: 0
4150: positiveSegments: 4, negativeSegments: 0
4152: positiveSegments: 4, negativeSegments: 0
4155: positiveSegments: 4, negativeSegments: 3
4162: positiveSegments: 8, negativeSegments: 9
4166: positiveSegments: 22, negativeSegments: 21
4167: positiveSegments: 21, negativeSegments: 0
4168: positiveSegments: 3, negativeSegments: 0
4172: positiveSegments: 4, negativeSegments: 3
4173: positiveSegments: 0, negativeSegments: 3
4177: positiveSegments: 4, negativeSegments: 0
4179: positiveSegments: 21, negativeSegments: 0
4181: positiveSegments: 0, negativeSegments: 3
4186: positiveSegments: 4, negativeSegments: 3
4189: positiveSegments: 4, negativeSegments: 6
4191: positiveSegments: 2, negativeSegments: 15
4195: positiveSegments: 0, negativeSegments: 6
4202: positiveSegments: 4, negativeSegments: 9
4206: positiveSegments: 8, negativeSegments: 0
4207: positiveSegments: 4, negativeSegments: 18
4208: positiveSegments: 7, negativeSegments: 0
4210: positiveSegments: 6, negativeSegments: 3
4211: positiveSegments: 11, negativeSegments: 3
4212: positiveSegments: 0, negativeSegments: 12
4213: positiveSegments: 0, negativeSegments: 3
4214: positiveSegments: 0, negativeSegments: 12
4216: positiveSegments: 0, negativeSegments: 15
4222: positiveSegments: 0, negativeSegments: 9
4223: positiveSegments: 0, negativeSegments: 3
4225: positiveSegments: 17, negativeSegments: 12
4227: positiveSegments: 4, negativeSegments: 3
4233: positiveSegments: 0, negativeSegments: 9
4236: positiveSegments: 8, negativeSegments: 3
4238: positiveSegments: 4, negativeSegments: 9
4240: positiveSegments: 18, negativeSegments: 0
4241: positiveSegments: 0, negativeSegments: 3
4242: positiveSegments: 4, negativeSegments: 3
4245: positiveSegments: 18, negativeSegments: 0
4247: positiveSegments: 0, negativeSegments: 6
4249: positiveSegments: 14, negativeSegments: 3
4251: positiveSegments: 19, negativeSegments: 0
4252: positiveSegments: 0, negativeSegments: 9
4254: positiveSegments: 4, negativeSegments: 6
4255: positiveSegments: 6, negativeSegments: 3
4256: positiveSegments: 43, negativeSegments: 6
4258: positiveSegments: 12, negativeSegments: 3
4259: positiveSegments: 0, negativeSegments: 6
4264: positiveSegments: 8, negativeSegments: 6
4265: positiveSegments: 13, negativeSegments: 0
4269: positiveSegments: 14, negativeSegments: 6
4272: positiveSegments: 17, negativeSegments: 3
4278: positiveSegments: 12, negativeSegments: 3
4279: positiveSegments: 4, negativeSegments: 3
4280: positiveSegments: 8, negativeSegments: 3
4281: positiveSegments: 0, negativeSegments: 3
4282: positiveSegments: 12, negativeSegments: 3
4283: positiveSegments: 16, negativeSegments: 0
4284: positiveSegments: 4, negativeSegments: 9
4286: positiveSegments: 4, negativeSegments: 0
4287: positiveSegments: 0, negativeSegments: 9
4289: positiveSegments: 27, negativeSegments: 6
4292: positiveSegments: 4, negativeSegments: 15
4293: positiveSegments: 2, negativeSegments: 0
4294: positiveSegments: 0, negativeSegments: 9
4296: positiveSegments: 3, negativeSegments: 6
4302: positiveSegments: 16, negativeSegments: 6
4304: positiveSegments: 28, negativeSegments: 0
4305: positiveSegments: 9, negativeSegments: 0
4307: positiveSegments: 4, negativeSegments: 3
4308: positiveSegments: 0, negativeSegments: 3
4309: positiveSegments: 3, negativeSegments: 3
4310: positiveSegments: 4, negativeSegments: 12
4314: positiveSegments: 0, negativeSegments: 9
4316: positiveSegments: 9, negativeSegments: 3
4317: positiveSegments: 10, negativeSegments: 0
4320: positiveSegments: 6, negativeSegments: 6
4322: positiveSegments: 4, negativeSegments: 9
4325: positiveSegments: 0, negativeSegments: 3
4326: positiveSegments: 8, negativeSegments: 3
4327: positiveSegments: 11, negativeSegments: 0
4332: positiveSegments: 5, negativeSegments: 0
4333: positiveSegments: 12, negativeSegments: 6
4335: positiveSegments: 0, negativeSegments: 15
4339: positiveSegments: 4, negativeSegments: 0
4341: positiveSegments: 4, negativeSegments: 12
4345: positiveSegments: 11, negativeSegments: 0
count processed: 1900, current case index: 4347
4347: positiveSegments: 15, negativeSegments: 3
4350: positiveSegments: 0, negativeSegments: 3
4354: positiveSegments: 7, negativeSegments: 6
4356: positiveSegments: 0, negativeSegments: 9
4364: positiveSegments: 4, negativeSegments: 3
4367: positiveSegments: 0, negativeSegments: 9
4368: positiveSegments: 4, negativeSegments: 6
4371: positiveSegments: 0, negativeSegments: 3
4375: positiveSegments: 0, negativeSegments: 15
4377: positiveSegments: 8, negativeSegments: 15
4380: positiveSegments: 4, negativeSegments: 0
4382: positiveSegments: 5, negativeSegments: 3
4383: positiveSegments: 8, negativeSegments: 6
4385: positiveSegments: 0, negativeSegments: 3
4388: positiveSegments: 1, negativeSegments: 0
4389: positiveSegments: 0, negativeSegments: 3
4390: positiveSegments: 11, negativeSegments: 6
4392: positiveSegments: 8, negativeSegments: 3
4396: positiveSegments: 0, negativeSegments: 3
4398: positiveSegments: 14, negativeSegments: 0
4400: positiveSegments: 4, negativeSegments: 6
4401: positiveSegments: 0, negativeSegments: 6
4402: positiveSegments: 4, negativeSegments: 12
4405: positiveSegments: 15, negativeSegments: 0
4406: positiveSegments: 3, negativeSegments: 0
4408: positiveSegments: 4, negativeSegments: 0
4409: positiveSegments: 4, negativeSegments: 9
4411: positiveSegments: 4, negativeSegments: 0
4417: positiveSegments: 4, negativeSegments: 3
4424: positiveSegments: 0, negativeSegments: 12
4426: positiveSegments: 0, negativeSegments: 6
4429: positiveSegments: 14, negativeSegments: 6
4430: positiveSegments: 0, negativeSegments: 9
4432: positiveSegments: 0, negativeSegments: 6
4437: positiveSegments: 7, negativeSegments: 3
4439: positiveSegments: 12, negativeSegments: 0
4443: positiveSegments: 0, negativeSegments: 6
4449: positiveSegments: 0, negativeSegments: 6
4451: positiveSegments: 10, negativeSegments: 3
4453: positiveSegments: 8, negativeSegments: 0
4456: positiveSegments: 4, negativeSegments: 12
4457: positiveSegments: 4, negativeSegments: 9
4458: positiveSegments: 6, negativeSegments: 0
4461: positiveSegments: 8, negativeSegments: 15
4462: positiveSegments: 11, negativeSegments: 0
4463: positiveSegments: 0, negativeSegments: 12
4470: positiveSegments: 0, negativeSegments: 6
4472: positiveSegments: 12, negativeSegments: 0
4474: positiveSegments: 0, negativeSegments: 9
4475: positiveSegments: 0, negativeSegments: 9
4476: positiveSegments: 11, negativeSegments: 3
4477: positiveSegments: 3, negativeSegments: 0
4478: positiveSegments: 0, negativeSegments: 12
4480: positiveSegments: 0, negativeSegments: 3
4481: positiveSegments: 25, negativeSegments: 9
4483: positiveSegments: 0, negativeSegments: 3
4485: exit early, no segments to save
4485: positiveSegments: 0, negativeSegments: 0
4489: positiveSegments: 0, negativeSegments: 6
4490: positiveSegments: 6, negativeSegments: 0
4496: positiveSegments: 15, negativeSegments: 9
4497: positiveSegments: 4, negativeSegments: 0
4498: positiveSegments: 0, negativeSegments: 6
4502: positiveSegments: 4, negativeSegments: 3
4503: positiveSegments: 0, negativeSegments: 3
4504: positiveSegments: 39, negativeSegments: 0
4509: positiveSegments: 4, negativeSegments: 9
4510: positiveSegments: 15, negativeSegments: 3
4515: positiveSegments: 1, negativeSegments: 0
4520: positiveSegments: 0, negativeSegments: 3
4522: positiveSegments: 0, negativeSegments: 3
4525: positiveSegments: 0, negativeSegments: 9
4530: positiveSegments: 0, negativeSegments: 6
4536: positiveSegments: 0, negativeSegments: 3
4538: positiveSegments: 2, negativeSegments: 3
4540: positiveSegments: 0, negativeSegments: 9
4541: positiveSegments: 0, negativeSegments: 3
4546: positiveSegments: 0, negativeSegments: 3
4547: positiveSegments: 0, negativeSegments: 6
4549: positiveSegments: 8, negativeSegments: 3
4550: positiveSegments: 0, negativeSegments: 9
4559: positiveSegments: 7, negativeSegments: 0
4561: positiveSegments: 4, negativeSegments: 0
4564: positiveSegments: 4, negativeSegments: 3
4568: positiveSegments: 12, negativeSegments: 6
4569: positiveSegments: 0, negativeSegments: 3
4572: positiveSegments: 0, negativeSegments: 3
4573: positiveSegments: 34, negativeSegments: 3
4574: positiveSegments: 28, negativeSegments: 12
4576: positiveSegments: 0, negativeSegments: 15
4577: positiveSegments: 0, negativeSegments: 6
4578: positiveSegments: 5, negativeSegments: 0
4581: positiveSegments: 3, negativeSegments: 3
4584: positiveSegments: 0, negativeSegments: 9
4589: positiveSegments: 4, negativeSegments: 9
4591: positiveSegments: 17, negativeSegments: 0
4596: positiveSegments: 0, negativeSegments: 9
4597: positiveSegments: 10, negativeSegments: 0
4599: positiveSegments: 10, negativeSegments: 0
4600: positiveSegments: 11, negativeSegments: 6
4602: positiveSegments: 6, negativeSegments: 0
count processed: 2000, current case index: 4603
4603: positiveSegments: 0, negativeSegments: 6
4604: positiveSegments: 15, negativeSegments: 12
4605: positiveSegments: 0, negativeSegments: 3
4607: positiveSegments: 7, negativeSegments: 0
4609: positiveSegments: 0, negativeSegments: 27
4612: positiveSegments: 6, negativeSegments: 6
4617: positiveSegments: 4, negativeSegments: 9
4618: positiveSegments: 10, negativeSegments: 6
4619: positiveSegments: 0, negativeSegments: 6
4620: positiveSegments: 4, negativeSegments: 0
4621: positiveSegments: 6, negativeSegments: 6
4626: positiveSegments: 8, negativeSegments: 3
4627: positiveSegments: 9, negativeSegments: 0
4631: positiveSegments: 0, negativeSegments: 3
4632: positiveSegments: 12, negativeSegments: 6
4635: positiveSegments: 4, negativeSegments: 0
4639: positiveSegments: 6, negativeSegments: 6
4644: positiveSegments: 12, negativeSegments: 15
4646: positiveSegments: 6, negativeSegments: 0
4652: positiveSegments: 19, negativeSegments: 6
4653: positiveSegments: 10, negativeSegments: 3
4654: positiveSegments: 3, negativeSegments: 9
4655: positiveSegments: 3, negativeSegments: 0
4656: positiveSegments: 9, negativeSegments: 0
4657: positiveSegments: 13, negativeSegments: 0
4658: positiveSegments: 1, negativeSegments: 6
4660: positiveSegments: 0, negativeSegments: 3
4662: positiveSegments: 12, negativeSegments: 0
4665: positiveSegments: 15, negativeSegments: 3
4666: positiveSegments: 4, negativeSegments: 15
4670: positiveSegments: 36, negativeSegments: 0
4678: positiveSegments: 0, negativeSegments: 6
4683: positiveSegments: 7, negativeSegments: 0
4684: positiveSegments: 20, negativeSegments: 9
4686: positiveSegments: 8, negativeSegments: 0
4695: positiveSegments: 3, negativeSegments: 0
4700: positiveSegments: 0, negativeSegments: 3
4705: positiveSegments: 3, negativeSegments: 0
4706: positiveSegments: 0, negativeSegments: 9
4711: positiveSegments: 6, negativeSegments: 3
4713: positiveSegments: 0, negativeSegments: 12
4715: positiveSegments: 13, negativeSegments: 9
4716: positiveSegments: 4, negativeSegments: 3
4717: positiveSegments: 0, negativeSegments: 6
4718: positiveSegments: 10, negativeSegments: 3
4721: positiveSegments: 20, negativeSegments: 0
4724: positiveSegments: 0, negativeSegments: 3
4726: positiveSegments: 3, negativeSegments: 3
4729: positiveSegments: 3, negativeSegments: 6
4731: positiveSegments: 0, negativeSegments: 9
4732: positiveSegments: 0, negativeSegments: 15
4739: positiveSegments: 11, negativeSegments: 0
4743: positiveSegments: 4, negativeSegments: 3
4744: positiveSegments: 6, negativeSegments: 0
4745: positiveSegments: 4, negativeSegments: 9
4746: positiveSegments: 0, negativeSegments: 6
4749: positiveSegments: 16, negativeSegments: 3
4750: positiveSegments: 0, negativeSegments: 12
4752: positiveSegments: 0, negativeSegments: 9
4755: positiveSegments: 4, negativeSegments: 0
4757: positiveSegments: 10, negativeSegments: 15
4759: positiveSegments: 17, negativeSegments: 9
4761: positiveSegments: 3, negativeSegments: 0
4763: positiveSegments: 3, negativeSegments: 0
4764: positiveSegments: 8, negativeSegments: 12
4767: positiveSegments: 2, negativeSegments: 3
4768: positiveSegments: 0, negativeSegments: 3
4769: positiveSegments: 0, negativeSegments: 3
4771: positiveSegments: 2, negativeSegments: 0
4773: positiveSegments: 4, negativeSegments: 3
4775: positiveSegments: 13, negativeSegments: 3
4777: positiveSegments: 10, negativeSegments: 0
4778: positiveSegments: 8, negativeSegments: 0
4779: positiveSegments: 10, negativeSegments: 0
4780: positiveSegments: 7, negativeSegments: 3
4781: positiveSegments: 0, negativeSegments: 21
4783: positiveSegments: 4, negativeSegments: 0
4785: positiveSegments: 0, negativeSegments: 6
4786: positiveSegments: 3, negativeSegments: 0
4788: positiveSegments: 12, negativeSegments: 3
4789: positiveSegments: 3, negativeSegments: 6
4790: positiveSegments: 0, negativeSegments: 3
4792: positiveSegments: 0, negativeSegments: 9
4798: positiveSegments: 0, negativeSegments: 9
4800: positiveSegments: 22, negativeSegments: 3
4801: positiveSegments: 25, negativeSegments: 9
4802: positiveSegments: 9, negativeSegments: 0
4803: positiveSegments: 0, negativeSegments: 3
4806: positiveSegments: 0, negativeSegments: 9
4808: positiveSegments: 4, negativeSegments: 3
4809: positiveSegments: 13, negativeSegments: 6
4813: positiveSegments: 4, negativeSegments: 3
4816: positiveSegments: 12, negativeSegments: 0
4817: positiveSegments: 35, negativeSegments: 3
4818: positiveSegments: 0, negativeSegments: 3
4820: positiveSegments: 14, negativeSegments: 0
4823: positiveSegments: 0, negativeSegments: 6
4825: positiveSegments: 10, negativeSegments: 0
4826: positiveSegments: 4, negativeSegments: 3
4828: positiveSegments: 19, negativeSegments: 0
count processed: 2100, current case index: 4830
4830: positiveSegments: 4, negativeSegments: 0
4831: positiveSegments: 8, negativeSegments: 3
4835: positiveSegments: 7, negativeSegments: 6
4837: positiveSegments: 0, negativeSegments: 3
4839: positiveSegments: 10, negativeSegments: 6
4841: positiveSegments: 3, negativeSegments: 0
4843: positiveSegments: 0, negativeSegments: 6
4844: positiveSegments: 0, negativeSegments: 3
4847: positiveSegments: 0, negativeSegments: 3
4851: positiveSegments: 24, negativeSegments: 0
4853: positiveSegments: 7, negativeSegments: 9
4857: positiveSegments: 0, negativeSegments: 3
4858: positiveSegments: 15, negativeSegments: 3
4859: positiveSegments: 0, negativeSegments: 6
4861: positiveSegments: 20, negativeSegments: 0
4865: positiveSegments: 0, negativeSegments: 12
4869: positiveSegments: 10, negativeSegments: 3
4871: positiveSegments: 25, negativeSegments: 3
4872: positiveSegments: 0, negativeSegments: 6
4874: positiveSegments: 0, negativeSegments: 6
4875: positiveSegments: 3, negativeSegments: 3
4879: positiveSegments: 4, negativeSegments: 3
4880: positiveSegments: 6, negativeSegments: 3
4882: positiveSegments: 0, negativeSegments: 6
4883: positiveSegments: 0, negativeSegments: 3
4887: positiveSegments: 16, negativeSegments: 18
4893: positiveSegments: 19, negativeSegments: 0
4894: positiveSegments: 8, negativeSegments: 6
4897: positiveSegments: 6, negativeSegments: 3
4899: positiveSegments: 0, negativeSegments: 9
4902: positiveSegments: 9, negativeSegments: 3
4903: positiveSegments: 0, negativeSegments: 3
4907: positiveSegments: 4, negativeSegments: 6
4911: positiveSegments: 14, negativeSegments: 9
4912: positiveSegments: 8, negativeSegments: 15
4913: positiveSegments: 0, negativeSegments: 3
4914: positiveSegments: 2, negativeSegments: 0
4916: positiveSegments: 17, negativeSegments: 0
4925: positiveSegments: 0, negativeSegments: 6
4929: positiveSegments: 6, negativeSegments: 3
4932: positiveSegments: 0, negativeSegments: 3
4933: positiveSegments: 17, negativeSegments: 0
4934: positiveSegments: 22, negativeSegments: 15
4935: positiveSegments: 0, negativeSegments: 6
4938: positiveSegments: 0, negativeSegments: 3
4939: positiveSegments: 4, negativeSegments: 6
4941: positiveSegments: 0, negativeSegments: 3
4942: positiveSegments: 14, negativeSegments: 3
4943: positiveSegments: 0, negativeSegments: 9
4946: positiveSegments: 4, negativeSegments: 0
4947: positiveSegments: 0, negativeSegments: 18
4949: positiveSegments: 0, negativeSegments: 6
4951: positiveSegments: 0, negativeSegments: 6
4953: positiveSegments: 12, negativeSegments: 6
4954: positiveSegments: 4, negativeSegments: 6
4957: positiveSegments: 4, negativeSegments: 9
4958: positiveSegments: 16, negativeSegments: 6
4959: positiveSegments: 21, negativeSegments: 3
4965: positiveSegments: 0, negativeSegments: 6
4966: positiveSegments: 16, negativeSegments: 3
4971: positiveSegments: 7, negativeSegments: 0
4973: positiveSegments: 4, negativeSegments: 6
4974: positiveSegments: 4, negativeSegments: 6
4976: positiveSegments: 4, negativeSegments: 9
4977: positiveSegments: 7, negativeSegments: 0
4982: positiveSegments: 4, negativeSegments: 3
4987: positiveSegments: 0, negativeSegments: 12
4989: positiveSegments: 4, negativeSegments: 6
4991: positiveSegments: 0, negativeSegments: 6
4992: positiveSegments: 6, negativeSegments: 0
4995: positiveSegments: 3, negativeSegments: 15
4997: positiveSegments: 31, negativeSegments: 3
4999: positiveSegments: 3, negativeSegments: 12
5004: positiveSegments: 6, negativeSegments: 0
5005: positiveSegments: 11, negativeSegments: 0
5006: positiveSegments: 20, negativeSegments: 0
5008: positiveSegments: 0, negativeSegments: 6
5010: positiveSegments: 10, negativeSegments: 15
5014: positiveSegments: 0, negativeSegments: 15
5015: positiveSegments: 0, negativeSegments: 6
5018: positiveSegments: 12, negativeSegments: 0
5019: positiveSegments: 12, negativeSegments: 12
5021: positiveSegments: 0, negativeSegments: 6
5024: positiveSegments: 4, negativeSegments: 3
5027: positiveSegments: 4, negativeSegments: 0
5029: positiveSegments: 0, negativeSegments: 3
5031: positiveSegments: 3, negativeSegments: 0
5033: positiveSegments: 0, negativeSegments: 3
5036: positiveSegments: 25, negativeSegments: 0
5040: positiveSegments: 7, negativeSegments: 3
5043: positiveSegments: 10, negativeSegments: 6
5044: positiveSegments: 12, negativeSegments: 6
5045: positiveSegments: 8, negativeSegments: 3
5046: positiveSegments: 3, negativeSegments: 0
5052: positiveSegments: 13, negativeSegments: 0
5059: positiveSegments: 21, negativeSegments: 0
5060: positiveSegments: 0, negativeSegments: 21
5061: positiveSegments: 1, negativeSegments: 0
5068: positiveSegments: 8, negativeSegments: 3
5070: positiveSegments: 4, negativeSegments: 0
count processed: 2200, current case index: 5072
5072: positiveSegments: 0, negativeSegments: 12
5077: positiveSegments: 3, negativeSegments: 6
5079: positiveSegments: 0, negativeSegments: 6
5080: positiveSegments: 4, negativeSegments: 0
5084: positiveSegments: 12, negativeSegments: 0
5089: positiveSegments: 0, negativeSegments: 12
5090: positiveSegments: 8, negativeSegments: 3
5091: positiveSegments: 0, negativeSegments: 9
5092: positiveSegments: 0, negativeSegments: 3
5093: positiveSegments: 0, negativeSegments: 6
5099: positiveSegments: 22, negativeSegments: 12
5100: positiveSegments: 26, negativeSegments: 0
5104: positiveSegments: 0, negativeSegments: 6
5106: positiveSegments: 4, negativeSegments: 15
5108: positiveSegments: 12, negativeSegments: 3
5109: positiveSegments: 12, negativeSegments: 3
5110: positiveSegments: 7, negativeSegments: 0
5111: positiveSegments: 4, negativeSegments: 9
5112: positiveSegments: 3, negativeSegments: 9
5118: positiveSegments: 10, negativeSegments: 0
5119: positiveSegments: 28, negativeSegments: 3
5122: positiveSegments: 0, negativeSegments: 6
5124: positiveSegments: 0, negativeSegments: 3
5125: positiveSegments: 4, negativeSegments: 0
5126: positiveSegments: 4, negativeSegments: 3
5131: positiveSegments: 14, negativeSegments: 3
5135: positiveSegments: 11, negativeSegments: 9
5137: positiveSegments: 0, negativeSegments: 9
5139: positiveSegments: 16, negativeSegments: 0
5140: positiveSegments: 0, negativeSegments: 9
5148: positiveSegments: 0, negativeSegments: 3
5149: positiveSegments: 4, negativeSegments: 0
5150: positiveSegments: 0, negativeSegments: 6
5152: positiveSegments: 14, negativeSegments: 6
5153: positiveSegments: 4, negativeSegments: 9
5155: positiveSegments: 4, negativeSegments: 3
5156: positiveSegments: 5, negativeSegments: 0
5157: positiveSegments: 2, negativeSegments: 0
5162: positiveSegments: 3, negativeSegments: 9
5163: positiveSegments: 11, negativeSegments: 9
5166: positiveSegments: 4, negativeSegments: 12
5173: positiveSegments: 26, negativeSegments: 0
5174: positiveSegments: 17, negativeSegments: 0
5178: positiveSegments: 5, negativeSegments: 6
5180: positiveSegments: 0, negativeSegments: 12
5182: positiveSegments: 0, negativeSegments: 6
5184: positiveSegments: 8, negativeSegments: 3
5185: positiveSegments: 6, negativeSegments: 0
5187: positiveSegments: 5, negativeSegments: 0
5192: positiveSegments: 0, negativeSegments: 3
5193: positiveSegments: 0, negativeSegments: 6
5195: positiveSegments: 4, negativeSegments: 3
5198: positiveSegments: 10, negativeSegments: 6
5201: positiveSegments: 28, negativeSegments: 0
5202: positiveSegments: 0, negativeSegments: 9
5203: positiveSegments: 12, negativeSegments: 12
5204: positiveSegments: 3, negativeSegments: 0
5213: positiveSegments: 4, negativeSegments: 6
5214: positiveSegments: 0, negativeSegments: 3
5217: positiveSegments: 6, negativeSegments: 0
5221: positiveSegments: 8, negativeSegments: 0
5222: positiveSegments: 9, negativeSegments: 9
5224: positiveSegments: 8, negativeSegments: 15
5225: positiveSegments: 8, negativeSegments: 0
5226: positiveSegments: 2, negativeSegments: 0
5229: positiveSegments: 8, negativeSegments: 6
5230: positiveSegments: 5, negativeSegments: 0
5232: positiveSegments: 0, negativeSegments: 12
5235: positiveSegments: 4, negativeSegments: 18
5237: positiveSegments: 8, negativeSegments: 0
5245: positiveSegments: 21, negativeSegments: 6
5246: positiveSegments: 4, negativeSegments: 3
5247: positiveSegments: 8, negativeSegments: 9
5255: positiveSegments: 0, negativeSegments: 9
5259: positiveSegments: 11, negativeSegments: 0
5264: positiveSegments: 0, negativeSegments: 15
5266: positiveSegments: 4, negativeSegments: 0
5267: positiveSegments: 2, negativeSegments: 6
5270: positiveSegments: 6, negativeSegments: 0
5271: positiveSegments: 0, negativeSegments: 15
5277: positiveSegments: 4, negativeSegments: 12
5278: positiveSegments: 0, negativeSegments: 9
5280: positiveSegments: 4, negativeSegments: 0
5283: positiveSegments: 0, negativeSegments: 3
5284: positiveSegments: 15, negativeSegments: 0
5285: positiveSegments: 8, negativeSegments: 0
5288: positiveSegments: 4, negativeSegments: 3
5292: positiveSegments: 2, negativeSegments: 0
5295: positiveSegments: 4, negativeSegments: 9
5296: positiveSegments: 4, negativeSegments: 6
5297: positiveSegments: 3, negativeSegments: 0
5298: positiveSegments: 28, negativeSegments: 6
5299: positiveSegments: 4, negativeSegments: 0
5302: positiveSegments: 8, negativeSegments: 0
5304: positiveSegments: 22, negativeSegments: 0
5305: positiveSegments: 18, negativeSegments: 6
5308: positiveSegments: 0, negativeSegments: 15
5309: positiveSegments: 0, negativeSegments: 6
5310: positiveSegments: 11, negativeSegments: 0
5311: positiveSegments: 16, negativeSegments: 3
count processed: 2300, current case index: 5314
5314: positiveSegments: 0, negativeSegments: 15
5317: positiveSegments: 4, negativeSegments: 12
5319: positiveSegments: 9, negativeSegments: 9
5321: positiveSegments: 0, negativeSegments: 12
5322: positiveSegments: 10, negativeSegments: 6
5323: positiveSegments: 4, negativeSegments: 3
5324: positiveSegments: 4, negativeSegments: 0
5332: positiveSegments: 0, negativeSegments: 3
5342: positiveSegments: 0, negativeSegments: 6
5344: positiveSegments: 0, negativeSegments: 6
5346: positiveSegments: 28, negativeSegments: 6
5347: positiveSegments: 11, negativeSegments: 12
5349: positiveSegments: 7, negativeSegments: 6
5352: positiveSegments: 0, negativeSegments: 12
5353: positiveSegments: 8, negativeSegments: 6
5359: positiveSegments: 0, negativeSegments: 3
5360: positiveSegments: 11, negativeSegments: 21
5362: positiveSegments: 20, negativeSegments: 3
5363: positiveSegments: 8, negativeSegments: 12
5364: positiveSegments: 0, negativeSegments: 6
5367: positiveSegments: 12, negativeSegments: 3
5368: positiveSegments: 4, negativeSegments: 0
5371: positiveSegments: 4, negativeSegments: 6
5372: positiveSegments: 8, negativeSegments: 3
5379: positiveSegments: 5, negativeSegments: 3
5383: positiveSegments: 0, negativeSegments: 6
5387: positiveSegments: 10, negativeSegments: 9
5388: positiveSegments: 0, negativeSegments: 9
5393: positiveSegments: 0, negativeSegments: 3
5394: positiveSegments: 4, negativeSegments: 0
5395: positiveSegments: 0, negativeSegments: 6
5396: positiveSegments: 14, negativeSegments: 18
5397: positiveSegments: 2, negativeSegments: 0
5399: positiveSegments: 2, negativeSegments: 3
5403: positiveSegments: 8, negativeSegments: 6
5406: positiveSegments: 0, negativeSegments: 3
5411: positiveSegments: 3, negativeSegments: 0
5415: positiveSegments: 13, negativeSegments: 0
5416: positiveSegments: 38, negativeSegments: 3
5420: positiveSegments: 11, negativeSegments: 3
5421: positiveSegments: 26, negativeSegments: 6
5423: positiveSegments: 22, negativeSegments: 0
5425: positiveSegments: 4, negativeSegments: 9
5427: positiveSegments: 4, negativeSegments: 3
5431: positiveSegments: 0, negativeSegments: 3
5434: positiveSegments: 3, negativeSegments: 3
5442: positiveSegments: 4, negativeSegments: 3
5443: positiveSegments: 10, negativeSegments: 6
5446: positiveSegments: 4, negativeSegments: 0
5449: positiveSegments: 4, negativeSegments: 0
5451: positiveSegments: 16, negativeSegments: 3
5453: positiveSegments: 5, negativeSegments: 0
5454: positiveSegments: 7, negativeSegments: 0
5456: positiveSegments: 0, negativeSegments: 6
5458: positiveSegments: 32, negativeSegments: 9
5460: positiveSegments: 2, negativeSegments: 0
5462: positiveSegments: 0, negativeSegments: 6
5463: positiveSegments: 0, negativeSegments: 6
5467: positiveSegments: 13, negativeSegments: 3
5474: positiveSegments: 4, negativeSegments: 3
5475: positiveSegments: 0, negativeSegments: 9
5478: positiveSegments: 4, negativeSegments: 0
5479: positiveSegments: 0, negativeSegments: 6
5480: positiveSegments: 0, negativeSegments: 9
5484: positiveSegments: 0, negativeSegments: 6
5486: positiveSegments: 8, negativeSegments: 3
5487: positiveSegments: 4, negativeSegments: 6
5490: positiveSegments: 7, negativeSegments: 0
5492: positiveSegments: 4, negativeSegments: 6
5495: positiveSegments: 4, negativeSegments: 9
5497: positiveSegments: 26, negativeSegments: 3
5499: positiveSegments: 0, negativeSegments: 3
5500: positiveSegments: 1, negativeSegments: 0
5502: positiveSegments: 5, negativeSegments: 0
5505: positiveSegments: 7, negativeSegments: 3
5507: positiveSegments: 0, negativeSegments: 9
5508: positiveSegments: 0, negativeSegments: 12
5509: positiveSegments: 0, negativeSegments: 6
5511: positiveSegments: 6, negativeSegments: 3
5513: positiveSegments: 0, negativeSegments: 9
5515: positiveSegments: 3, negativeSegments: 12
5516: positiveSegments: 18, negativeSegments: 6
5517: positiveSegments: 4, negativeSegments: 6
5519: positiveSegments: 3, negativeSegments: 9
5520: positiveSegments: 0, negativeSegments: 3
5524: positiveSegments: 6, negativeSegments: 3
5526: positiveSegments: 0, negativeSegments: 6
5531: positiveSegments: 8, negativeSegments: 9
5533: positiveSegments: 0, negativeSegments: 3
5534: positiveSegments: 23, negativeSegments: 15
5536: positiveSegments: 0, negativeSegments: 3
5537: positiveSegments: 4, negativeSegments: 6
5538: positiveSegments: 8, negativeSegments: 0
5541: positiveSegments: 0, negativeSegments: 3
5546: positiveSegments: 6, negativeSegments: 0
5548: positiveSegments: 4, negativeSegments: 3
5552: positiveSegments: 22, negativeSegments: 0
5561: positiveSegments: 0, negativeSegments: 6
5564: positiveSegments: 14, negativeSegments: 0
5566: positiveSegments: 0, negativeSegments: 9
count processed: 2400, current case index: 5568
5568: positiveSegments: 12, negativeSegments: 3
5571: positiveSegments: 0, negativeSegments: 3
5572: positiveSegments: 0, negativeSegments: 9
5573: positiveSegments: 5, negativeSegments: 0
5574: positiveSegments: 8, negativeSegments: 9
5578: positiveSegments: 0, negativeSegments: 3
5582: positiveSegments: 8, negativeSegments: 15
5583: positiveSegments: 8, negativeSegments: 3
5585: positiveSegments: 10, negativeSegments: 3
5589: positiveSegments: 4, negativeSegments: 6
5593: positiveSegments: 11, negativeSegments: 3
5594: positiveSegments: 0, negativeSegments: 3
5595: positiveSegments: 24, negativeSegments: 3
5597: positiveSegments: 4, negativeSegments: 0
5598: positiveSegments: 6, negativeSegments: 0
5600: positiveSegments: 8, negativeSegments: 0
5601: positiveSegments: 6, negativeSegments: 6
5602: positiveSegments: 0, negativeSegments: 12
5603: positiveSegments: 4, negativeSegments: 6
5607: positiveSegments: 25, negativeSegments: 9
5610: positiveSegments: 4, negativeSegments: 0
5612: positiveSegments: 18, negativeSegments: 3
5613: positiveSegments: 8, negativeSegments: 9
5614: positiveSegments: 2, negativeSegments: 0
5616: positiveSegments: 0, negativeSegments: 9
5618: positiveSegments: 0, negativeSegments: 18
5620: positiveSegments: 10, negativeSegments: 3
5621: positiveSegments: 10, negativeSegments: 12
5624: positiveSegments: 12, negativeSegments: 3
5626: positiveSegments: 8, negativeSegments: 3
5627: positiveSegments: 0, negativeSegments: 15
5629: positiveSegments: 4, negativeSegments: 0
5630: positiveSegments: 4, negativeSegments: 12
5633: positiveSegments: 13, negativeSegments: 0
5635: positiveSegments: 4, negativeSegments: 0
5637: positiveSegments: 4, negativeSegments: 15
5638: positiveSegments: 15, negativeSegments: 0
5641: positiveSegments: 8, negativeSegments: 6
5642: positiveSegments: 4, negativeSegments: 0
5646: positiveSegments: 0, negativeSegments: 3
5647: positiveSegments: 0, negativeSegments: 3
5648: positiveSegments: 8, negativeSegments: 6
5650: positiveSegments: 5, negativeSegments: 3
5654: positiveSegments: 0, negativeSegments: 3
5655: positiveSegments: 8, negativeSegments: 3
5657: positiveSegments: 8, negativeSegments: 3
5658: positiveSegments: 12, negativeSegments: 3
5659: positiveSegments: 0, negativeSegments: 12
5662: positiveSegments: 0, negativeSegments: 3
5665: positiveSegments: 4, negativeSegments: 15
5669: positiveSegments: 8, negativeSegments: 12
5670: positiveSegments: 12, negativeSegments: 12
5671: positiveSegments: 0, negativeSegments: 6
5673: positiveSegments: 12, negativeSegments: 0
5675: positiveSegments: 4, negativeSegments: 9
5677: positiveSegments: 1, negativeSegments: 6
5678: positiveSegments: 4, negativeSegments: 27
5680: positiveSegments: 0, negativeSegments: 6
5682: positiveSegments: 15, negativeSegments: 0
5684: positiveSegments: 10, negativeSegments: 6
5685: positiveSegments: 5, negativeSegments: 6
5687: positiveSegments: 22, negativeSegments: 6
5690: positiveSegments: 4, negativeSegments: 9
5691: positiveSegments: 2, negativeSegments: 0
5692: positiveSegments: 0, negativeSegments: 6
5694: positiveSegments: 0, negativeSegments: 9
5696: positiveSegments: 7, negativeSegments: 0
5698: positiveSegments: 35, negativeSegments: 0
5703: positiveSegments: 4, negativeSegments: 6
5715: positiveSegments: 12, negativeSegments: 18
5718: positiveSegments: 0, negativeSegments: 3
5719: positiveSegments: 2, negativeSegments: 3
5721: positiveSegments: 0, negativeSegments: 3
5724: positiveSegments: 0, negativeSegments: 18
5725: positiveSegments: 6, negativeSegments: 9
5727: positiveSegments: 0, negativeSegments: 6
5729: positiveSegments: 0, negativeSegments: 15
5733: positiveSegments: 0, negativeSegments: 15
5743: positiveSegments: 0, negativeSegments: 3
5745: positiveSegments: 7, negativeSegments: 0
5746: positiveSegments: 24, negativeSegments: 3
5749: positiveSegments: 4, negativeSegments: 6
5750: positiveSegments: 8, negativeSegments: 0
5751: positiveSegments: 4, negativeSegments: 12
5753: positiveSegments: 5, negativeSegments: 0
5755: positiveSegments: 4, negativeSegments: 0
5759: positiveSegments: 4, negativeSegments: 6
5760: positiveSegments: 21, negativeSegments: 18
5765: positiveSegments: 6, negativeSegments: 0
5769: positiveSegments: 0, negativeSegments: 12
5771: positiveSegments: 25, negativeSegments: 3
5772: positiveSegments: 0, negativeSegments: 9
5777: positiveSegments: 0, negativeSegments: 9
5780: positiveSegments: 8, negativeSegments: 6
5781: positiveSegments: 3, negativeSegments: 3
5782: exit early, no segments to save
5782: positiveSegments: 0, negativeSegments: 0
5783: positiveSegments: 8, negativeSegments: 15
5784: positiveSegments: 9, negativeSegments: 0
5787: positiveSegments: 53, negativeSegments: 0
5788: positiveSegments: 4, negativeSegments: 12
count processed: 2500, current case index: 5793
5793: positiveSegments: 0, negativeSegments: 12
5795: positiveSegments: 11, negativeSegments: 6
5799: positiveSegments: 4, negativeSegments: 0
5801: positiveSegments: 0, negativeSegments: 3
5805: positiveSegments: 2, negativeSegments: 0
5806: positiveSegments: 16, negativeSegments: 3
5808: positiveSegments: 0, negativeSegments: 3
5809: positiveSegments: 3, negativeSegments: 21
5811: positiveSegments: 2, negativeSegments: 0
5816: positiveSegments: 0, negativeSegments: 12
5819: positiveSegments: 12, negativeSegments: 15
5825: positiveSegments: 7, negativeSegments: 12
5826: positiveSegments: 13, negativeSegments: 12
5827: positiveSegments: 8, negativeSegments: 6
5829: positiveSegments: 6, negativeSegments: 6
5831: positiveSegments: 21, negativeSegments: 3
5832: positiveSegments: 0, negativeSegments: 6
5834: positiveSegments: 0, negativeSegments: 6
5837: positiveSegments: 7, negativeSegments: 6
5839: positiveSegments: 0, negativeSegments: 9
5840: positiveSegments: 0, negativeSegments: 21
5842: positiveSegments: 0, negativeSegments: 9
5843: positiveSegments: 4, negativeSegments: 12
5844: positiveSegments: 12, negativeSegments: 6
5848: positiveSegments: 4, negativeSegments: 0
5849: positiveSegments: 18, negativeSegments: 0
5851: positiveSegments: 0, negativeSegments: 9
5859: positiveSegments: 21, negativeSegments: 9
5860: positiveSegments: 12, negativeSegments: 9
5861: positiveSegments: 3, negativeSegments: 0
5862: positiveSegments: 33, negativeSegments: 0
5864: positiveSegments: 0, negativeSegments: 3
5865: positiveSegments: 15, negativeSegments: 3
5866: positiveSegments: 15, negativeSegments: 9
5869: positiveSegments: 8, negativeSegments: 9
5870: positiveSegments: 0, negativeSegments: 9
5871: exit early, no segments to save
5871: positiveSegments: 0, negativeSegments: 0
5872: positiveSegments: 19, negativeSegments: 6
5873: positiveSegments: 7, negativeSegments: 3
5875: positiveSegments: 26, negativeSegments: 3
5884: positiveSegments: 5, negativeSegments: 0
5887: positiveSegments: 8, negativeSegments: 0
5888: positiveSegments: 0, negativeSegments: 3
5889: positiveSegments: 0, negativeSegments: 3
5890: positiveSegments: 7, negativeSegments: 3
5891: positiveSegments: 0, negativeSegments: 3
5892: positiveSegments: 0, negativeSegments: 3
5894: positiveSegments: 7, negativeSegments: 0
5895: positiveSegments: 4, negativeSegments: 9
5902: positiveSegments: 7, negativeSegments: 3
5904: positiveSegments: 8, negativeSegments: 6
5907: positiveSegments: 0, negativeSegments: 3
5911: positiveSegments: 6, negativeSegments: 3
5912: positiveSegments: 4, negativeSegments: 6
5914: positiveSegments: 2, negativeSegments: 3
5917: positiveSegments: 7, negativeSegments: 0
5918: positiveSegments: 0, negativeSegments: 9
5933: positiveSegments: 4, negativeSegments: 0
5934: positiveSegments: 0, negativeSegments: 3
5937: positiveSegments: 7, negativeSegments: 0
5938: positiveSegments: 4, negativeSegments: 6
5940: positiveSegments: 4, negativeSegments: 0
5942: positiveSegments: 6, negativeSegments: 3
5943: positiveSegments: 0, negativeSegments: 3
5944: positiveSegments: 4, negativeSegments: 9
5945: positiveSegments: 2, negativeSegments: 0
5946: positiveSegments: 12, negativeSegments: 6
5948: positiveSegments: 7, negativeSegments: 0
5950: positiveSegments: 9, negativeSegments: 9
5951: positiveSegments: 0, negativeSegments: 3
5954: positiveSegments: 8, negativeSegments: 0
5956: positiveSegments: 4, negativeSegments: 6
5958: positiveSegments: 0, negativeSegments: 12
5961: positiveSegments: 0, negativeSegments: 3
5964: positiveSegments: 0, negativeSegments: 15
5965: positiveSegments: 8, negativeSegments: 0
5966: positiveSegments: 4, negativeSegments: 6
5967: positiveSegments: 21, negativeSegments: 9
5970: positiveSegments: 0, negativeSegments: 3
5973: positiveSegments: 0, negativeSegments: 9
5975: positiveSegments: 9, negativeSegments: 3
5976: positiveSegments: 14, negativeSegments: 0
5977: positiveSegments: 8, negativeSegments: 9
5981: positiveSegments: 15, negativeSegments: 0
5982: positiveSegments: 4, negativeSegments: 0
5983: positiveSegments: 17, negativeSegments: 0
5986: positiveSegments: 4, negativeSegments: 9
5987: positiveSegments: 5, negativeSegments: 0
5989: positiveSegments: 5, negativeSegments: 3
5993: positiveSegments: 22, negativeSegments: 9
5994: positiveSegments: 4, negativeSegments: 0
5997: positiveSegments: 5, negativeSegments: 6
6000: positiveSegments: 0, negativeSegments: 21
6003: positiveSegments: 8, negativeSegments: 3
6007: positiveSegments: 14, negativeSegments: 3
6009: positiveSegments: 22, negativeSegments: 6
6010: positiveSegments: 4, negativeSegments: 12
6013: positiveSegments: 1, negativeSegments: 0
6015: positiveSegments: 4, negativeSegments: 0
6016: positiveSegments: 0, negativeSegments: 3
count processed: 2600, current case index: 6017
6017: positiveSegments: 0, negativeSegments: 9
6020: positiveSegments: 0, negativeSegments: 18
6022: positiveSegments: 8, negativeSegments: 0
6028: positiveSegments: 6, negativeSegments: 0
6029: positiveSegments: 0, negativeSegments: 6
6031: positiveSegments: 8, negativeSegments: 3
6037: positiveSegments: 4, negativeSegments: 6
6039: positiveSegments: 13, negativeSegments: 12
6043: positiveSegments: 14, negativeSegments: 6
6047: positiveSegments: 0, negativeSegments: 9
6053: positiveSegments: 0, negativeSegments: 3
6058: positiveSegments: 28, negativeSegments: 9
6059: positiveSegments: 0, negativeSegments: 9
6060: positiveSegments: 0, negativeSegments: 6
6061: positiveSegments: 4, negativeSegments: 12
6063: positiveSegments: 10, negativeSegments: 9
6065: positiveSegments: 0, negativeSegments: 3
6066: positiveSegments: 18, negativeSegments: 0
6069: positiveSegments: 22, negativeSegments: 12
6070: positiveSegments: 0, negativeSegments: 6
6071: positiveSegments: 22, negativeSegments: 0
6074: positiveSegments: 4, negativeSegments: 18
6076: positiveSegments: 8, negativeSegments: 6
6077: positiveSegments: 13, negativeSegments: 6
6080: positiveSegments: 2, negativeSegments: 0
6082: positiveSegments: 25, negativeSegments: 3
6083: positiveSegments: 9, negativeSegments: 9
6084: positiveSegments: 25, negativeSegments: 9
6085: positiveSegments: 6, negativeSegments: 3
6086: positiveSegments: 16, negativeSegments: 6
6087: positiveSegments: 2, negativeSegments: 3
6089: positiveSegments: 0, negativeSegments: 6
6098: positiveSegments: 0, negativeSegments: 3
6101: positiveSegments: 4, negativeSegments: 6
6102: positiveSegments: 7, negativeSegments: 0
6103: positiveSegments: 0, negativeSegments: 12
6104: positiveSegments: 0, negativeSegments: 6
6114: positiveSegments: 23, negativeSegments: 0
6119: positiveSegments: 4, negativeSegments: 12
6121: positiveSegments: 16, negativeSegments: 3
6124: positiveSegments: 8, negativeSegments: 0
6126: positiveSegments: 0, negativeSegments: 15
6127: positiveSegments: 6, negativeSegments: 0
6129: positiveSegments: 8, negativeSegments: 12
6132: positiveSegments: 4, negativeSegments: 0
6133: positiveSegments: 0, negativeSegments: 9
6134: positiveSegments: 0, negativeSegments: 9
6135: positiveSegments: 0, negativeSegments: 3
6136: positiveSegments: 6, negativeSegments: 6
6140: positiveSegments: 0, negativeSegments: 3
6141: positiveSegments: 10, negativeSegments: 0
6143: positiveSegments: 4, negativeSegments: 9
6144: positiveSegments: 5, negativeSegments: 0
6147: positiveSegments: 0, negativeSegments: 15
6152: positiveSegments: 15, negativeSegments: 0
6153: positiveSegments: 4, negativeSegments: 6
6154: positiveSegments: 3, negativeSegments: 0
6156: positiveSegments: 0, negativeSegments: 6
6159: positiveSegments: 36, negativeSegments: 3
6160: positiveSegments: 24, negativeSegments: 0
6163: positiveSegments: 13, negativeSegments: 3
6166: positiveSegments: 24, negativeSegments: 3
6167: positiveSegments: 0, negativeSegments: 12
6168: positiveSegments: 0, negativeSegments: 3
6169: positiveSegments: 0, negativeSegments: 12
6171: positiveSegments: 12, negativeSegments: 9
6174: positiveSegments: 0, negativeSegments: 6
6176: positiveSegments: 11, negativeSegments: 9
6178: positiveSegments: 17, negativeSegments: 3
6179: positiveSegments: 30, negativeSegments: 9
6182: positiveSegments: 8, negativeSegments: 9
6184: positiveSegments: 0, negativeSegments: 9
6185: positiveSegments: 4, negativeSegments: 6
6186: positiveSegments: 0, negativeSegments: 6
6190: positiveSegments: 3, negativeSegments: 9
6191: positiveSegments: 8, negativeSegments: 3
6194: positiveSegments: 0, negativeSegments: 6
6195: positiveSegments: 8, negativeSegments: 3
6196: positiveSegments: 0, negativeSegments: 9
6198: positiveSegments: 0, negativeSegments: 3
6199: positiveSegments: 4, negativeSegments: 6
6200: positiveSegments: 0, negativeSegments: 3
6206: positiveSegments: 4, negativeSegments: 0
6208: positiveSegments: 0, negativeSegments: 3
6210: positiveSegments: 0, negativeSegments: 3
6214: positiveSegments: 5, negativeSegments: 6
6217: positiveSegments: 29, negativeSegments: 3
6218: positiveSegments: 4, negativeSegments: 6
6219: positiveSegments: 4, negativeSegments: 9
6220: positiveSegments: 25, negativeSegments: 12
6224: positiveSegments: 0, negativeSegments: 6
6227: positiveSegments: 10, negativeSegments: 0
6228: positiveSegments: 4, negativeSegments: 0
6230: positiveSegments: 4, negativeSegments: 3
6233: positiveSegments: 1, negativeSegments: 0
6235: positiveSegments: 0, negativeSegments: 3
6238: positiveSegments: 3, negativeSegments: 3
6239: positiveSegments: 8, negativeSegments: 3
6240: positiveSegments: 4, negativeSegments: 0
6241: positiveSegments: 4, negativeSegments: 9
count processed: 2700, current case index: 6248
6248: positiveSegments: 12, negativeSegments: 3
6250: positiveSegments: 0, negativeSegments: 6
6254: positiveSegments: 4, negativeSegments: 6
6255: positiveSegments: 4, negativeSegments: 12
6257: positiveSegments: 0, negativeSegments: 21
6260: positiveSegments: 9, negativeSegments: 0
6262: positiveSegments: 4, negativeSegments: 3
6264: positiveSegments: 0, negativeSegments: 18
6266: positiveSegments: 11, negativeSegments: 3
6267: positiveSegments: 4, negativeSegments: 6
6268: positiveSegments: 0, negativeSegments: 3
6269: positiveSegments: 6, negativeSegments: 0
6270: positiveSegments: 0, negativeSegments: 15
6273: positiveSegments: 0, negativeSegments: 6
6275: positiveSegments: 4, negativeSegments: 0
6279: positiveSegments: 0, negativeSegments: 12
6280: positiveSegments: 0, negativeSegments: 9
6281: positiveSegments: 23, negativeSegments: 0
6282: positiveSegments: 8, negativeSegments: 6
6284: positiveSegments: 5, negativeSegments: 3
6286: positiveSegments: 0, negativeSegments: 6
6289: positiveSegments: 4, negativeSegments: 0
6290: positiveSegments: 14, negativeSegments: 0
6292: positiveSegments: 10, negativeSegments: 6
6295: positiveSegments: 4, negativeSegments: 6
6296: positiveSegments: 13, negativeSegments: 6
6297: positiveSegments: 26, negativeSegments: 0
6298: positiveSegments: 0, negativeSegments: 6
6302: positiveSegments: 2, negativeSegments: 0
6305: positiveSegments: 0, negativeSegments: 9
6306: positiveSegments: 0, negativeSegments: 6
6307: positiveSegments: 0, negativeSegments: 3
6309: positiveSegments: 48, negativeSegments: 3
6311: positiveSegments: 11, negativeSegments: 3
6312: positiveSegments: 10, negativeSegments: 0
6314: positiveSegments: 4, negativeSegments: 3
6316: positiveSegments: 15, negativeSegments: 3
6317: positiveSegments: 18, negativeSegments: 0
6330: positiveSegments: 0, negativeSegments: 3
6331: exit early, no segments to save
6331: positiveSegments: 0, negativeSegments: 0
6332: positiveSegments: 4, negativeSegments: 15
6339: positiveSegments: 0, negativeSegments: 3
6343: positiveSegments: 4, negativeSegments: 15
6345: positiveSegments: 0, negativeSegments: 9
6346: positiveSegments: 0, negativeSegments: 9
6351: positiveSegments: 19, negativeSegments: 9
6355: positiveSegments: 2, negativeSegments: 6
6357: positiveSegments: 7, negativeSegments: 12
6359: positiveSegments: 2, negativeSegments: 0
6360: positiveSegments: 8, negativeSegments: 0
6361: positiveSegments: 12, negativeSegments: 3
6362: positiveSegments: 12, negativeSegments: 3
6363: positiveSegments: 8, negativeSegments: 15
6366: positiveSegments: 0, negativeSegments: 6
6370: positiveSegments: 0, negativeSegments: 3
6372: positiveSegments: 0, negativeSegments: 9
6375: positiveSegments: 8, negativeSegments: 0
6376: positiveSegments: 10, negativeSegments: 0
6378: positiveSegments: 8, negativeSegments: 0
6381: positiveSegments: 0, negativeSegments: 3
6383: positiveSegments: 0, negativeSegments: 12
6385: positiveSegments: 24, negativeSegments: 0
6386: positiveSegments: 8, negativeSegments: 15
extracted: 2763

Track and Segment Validity Checks¶

In [42]:
def printAbp(case_id_to_check, plot_invalid_only=False):
    vf_path = f'{VITAL_MINI}/{case_id_to_check:04d}_mini.vital'

    if not os.path.isfile(vf_path):
        return

    vf = vitaldb.VitalFile(vf_path)
    abp = vf.to_numpy(TRACK_NAMES[0], 1/500)

    print(f'Case {case_id_to_check}')
    print(f'ABP Shape: {abp.shape}')

    print(f'nanmin: {np.nanmin(abp)}')
    print(f'nanmean: {np.nanmean(abp)}')
    print(f'nanmax: {np.nanmax(abp)}')

    is_valid = isAbpSegmentValidNumpy(abp, debug=True)
    print(f'valid: {is_valid}')

    if plot_invalid_only and is_valid:
        return

    plt.figure(figsize=(20, 5))
    plt_color = 'C0' if is_valid else 'red'
    plt.plot(abp, plt_color)
    plt.title(f'ABP - Entire Track - Case {case_id_to_check} - {abp.shape[0] / 500} seconds')
    plt.axhline(y = 65, color = 'maroon', linestyle = '--')
    plt.show()
In [43]:
def printSegments(segmentsMap, case_id_to_check, print_label, normalize=False):
    for (x1, x2, r, abp, ecg, eeg) in segmentsMap[case_id_to_check]:
        print(f'{print_label}: Case {case_id_to_check}')
        print(f'lookback window: {r} min')
        print(f'start time: {x1}')
        print(f'end time: {x2}')
        print(f'length: {x2 - x1} sec')
        
        print(f'ABP Shape: {abp.shape}')
        print(f'ECG Shape: {ecg.shape}')
        print(f'EEG Shape: {eeg.shape}')

        print(f'nanmin: {np.nanmin(abp)}')
        print(f'nanmean: {np.nanmean(abp)}')
        print(f'nanmax: {np.nanmax(abp)}')
        
        is_valid = isAbpSegmentValidNumpy(abp, debug=True)
        print(f'valid: {is_valid}')

        # ABP normalization
        x_abp = np.copy(abp)
        if normalize:
            x_abp -= 65
            x_abp /= 65

        plt.figure(figsize=(20, 5))
        plt_color = 'C0' if is_valid else 'red'
        plt.plot(x_abp, plt_color)
        plt.title('ABP')
        plt.axhline(y = 65, color = 'maroon', linestyle = '--')
        plt.show()

        plt.figure(figsize=(20, 5))
        plt.plot(ecg, 'teal')
        plt.title('ECG')
        plt.show()

        plt.figure(figsize=(20, 5))
        plt.plot(eeg, 'indigo')
        plt.title('EEG')
        plt.show()

        print()
In [44]:
def printEvents(abp_raw, eventsMap, case_id_to_check, print_label, normalize=False):
    for (x1, x2) in eventsMap[case_id_to_check]:
        print(f'{print_label}: Case {case_id_to_check}')
        print(f'start time: {x1}')
        print(f'end time: {x2}')
        print(f'length: {x2 - x1} sec')

        abp = abp_raw[x1*500:x2*500]
        print(f'ABP Shape: {abp.shape}')

        print(f'nanmin: {np.nanmin(abp)}')
        print(f'nanmean: {np.nanmean(abp)}')
        print(f'nanmax: {np.nanmax(abp)}')
        
        is_valid = isAbpSegmentValidNumpy(abp, debug=True)
        print(f'valid: {is_valid}')

        # ABP normalization
        x_abp = np.copy(abp)
        if normalize:
            x_abp -= 65
            x_abp /= 65

        plt.figure(figsize=(20, 5))
        plt_color = 'C0' if is_valid else 'red'
        plt.plot(x_abp, plt_color)
        plt.title('ABP')
        plt.axhline(y = 65, color = 'maroon', linestyle = '--')
        plt.show()

        print()
In [45]:
def moving_average(x, seconds=60):
    w = seconds * 500
    return np.convolve(np.squeeze(x), np.ones(w), 'valid') / w
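As a sanity check of the window arithmetic (a standalone sketch that restates the helper so the snippet is self-contained, assuming the 500 Hz sampling rate used throughout): 'valid' convolution trims the edges, so the output has `len(x) - w + 1` samples, and a constant input must produce a constant output.

```python
import numpy as np

def moving_average(x, seconds=60):
    # 'valid' mode: output length = len(x) - w + 1, no edge padding
    w = seconds * 500  # 500 Hz sampling
    return np.convolve(np.squeeze(x), np.ones(w), 'valid') / w

# constant input -> constant output
abp = np.full(120 * 500, 80.0)  # 2 minutes of a flat 80 mmHg signal
avg = moving_average(abp, seconds=60)
print(avg.shape)  # (30001,)
assert np.allclose(avg, 80.0)
```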
In [46]:
def printAbpOverlay(
    case_id_to_check,
    positiveSegmentsMap,
    negativeSegmentsMap,
    iohEventsMap,
    cleanEventsMap,
    movingAverage=False
):
    def overlay_segments(plt, segmentsMap, color, linestyle, positive=False):
        for (x1, x2, r, abp, ecg, eeg) in segmentsMap:
            sx1 = x1*500
            sx2 = x2*500
            mycolor = color
            if positive:
                if r == 3:
                    mycolor = 'red'
                elif r == 5:
                    mycolor = 'crimson'
                elif r == 10:
                    mycolor = 'tomato'
                else:
                    mycolor = 'salmon'
            plt.axvline(x = sx1, color = mycolor, linestyle = linestyle)
            plt.axvline(x = sx2, color = mycolor, linestyle = linestyle)
            plt.axvspan(sx1, sx2, facecolor = mycolor, alpha = 0.1)

    def overlay_events(plt, abp, eventsMap, opstart, opend, color, linestyle):
        for (x1, x2) in eventsMap:
            sx1 = x1*500
            sx2 = x2*500
            # only plot valid events
            if isAbpSegmentValidNumpy(abp[sx1:sx2]):
                # that are within the operating start and end times
                if sx1 >= opstart and sx2 <= opend:
                    plt.axvline(x = sx1, color = color, linestyle = linestyle)
                    plt.axvline(x = sx2, color = color, linestyle = linestyle)
                    plt.axvspan(sx1, sx2, facecolor = color, alpha = 0.1)

    vf_path = f'{VITAL_MINI}/{case_id_to_check:04d}_mini.vital'

    if not os.path.isfile(vf_path):
        return

    vf = vitaldb.VitalFile(vf_path)
    abp = vf.to_numpy(TRACK_NAMES[0], 1/500)

    print(f'Case {case_id_to_check}')
    print(f'ABP Shape: {abp.shape}')

    print(f'nanmin: {np.nanmin(abp)}')
    print(f'nanmean: {np.nanmean(abp)}')
    print(f'nanmax: {np.nanmax(abp)}')

    #is_valid = isAbpSegmentValidNumpy(abp, debug=True)
    #print(f'valid: {is_valid}')

    plt.figure(figsize=(24, 8))
    plt_color = 'C0' #if is_valid else 'red'
    plt.plot(abp, plt_color)
    plt.title(f'ABP - Entire Track - Case {case_id_to_check} - {abp.shape[0] / 500} seconds')
    plt.axhline(y = 65, color = 'maroon', linestyle = '--')

    # https://matplotlib.org/stable/gallery/lines_bars_and_markers/linestyles.html#linestyles
    
    opstart = cases.loc[case_id_to_check]['opstart'].item() * 500
    plt.axvline(x = opstart, color = 'black', linestyle = '--', linewidth=2)
    plt.text(opstart - 600000, -200, 'Operation Start', fontsize=15)
    
    opend = cases.loc[case_id_to_check]['opend'].item() * 500
    plt.axvline(x = opend, color = 'black', linestyle = '--', linewidth=2)
    plt.text(opend + 50000, -200, 'Operation End', fontsize=15)
    
    overlay_segments(plt, positiveSegmentsMap[case_id_to_check], 'crimson', (0, (1, 1)), positive=True)
    
    overlay_segments(plt, negativeSegmentsMap[case_id_to_check], 'teal', (0, (1, 1)))

    overlay_events(plt, abp, iohEventsMap[case_id_to_check], opstart, opend, 'brown', '-')
    
    overlay_events(plt, abp, cleanEventsMap[case_id_to_check], opstart, opend, 'teal', '-')
    
    abp_mov_avg = None
    if movingAverage:
        abp_mov_avg = moving_average(abp[opstart:(opend + 60*500)])
        myx = np.arange(opstart, opstart + len(abp_mov_avg), 1)
        plt.plot(myx, abp_mov_avg, 'red')

    plt.show()

Reality Check All Cases¶

In [47]:
# Global flag to control creating track and segment plots.
# These plots are expensive to create, but very interesting.
# Disable when training in bulk to speed up notebook processing.
PERFORM_TRACK_VALIDITY_CHECKS = False
In [48]:
# Check if all ABPs are well formed. Fast load and scan of the raw track data for ABP.
DISPLAY_REALITY_CHECK_ABP=True
DISPLAY_REALITY_CHECK_ABP_FIRST_ONLY=True

if PERFORM_TRACK_VALIDITY_CHECKS and DISPLAY_REALITY_CHECK_ABP:
    for case_id_to_check in cases_of_interest_idx:
        printAbp(case_id_to_check, plot_invalid_only=False)
        
        if DISPLAY_REALITY_CHECK_ABP_FIRST_ONLY:
            break

Validate Malformed Vital Files - Missing One Or More Tracks¶

In [49]:
# These are Vital Files removed because of malformed ABP waveforms.
DISPLAY_MALFORMED_ABP=True
DISPLAY_MALFORMED_ABP_FIRST_ONLY=True

if PERFORM_TRACK_VALIDITY_CHECKS and DISPLAY_MALFORMED_ABP:
    malformed_case_ids = pd.read_csv('malformed_tracks_filter.csv', header=None, names=['caseid']).set_index('caseid').index

    for case_id_to_check in malformed_case_ids:
        printAbp(case_id_to_check)
        
        if DISPLAY_MALFORMED_ABP_FIRST_ONLY:
            break

Validate Cases With No Segments Saved¶

In [50]:
DISPLAY_NO_SEGMENTS_CASES=True
DISPLAY_NO_SEGMENTS_CASES_FIRST_ONLY=True

if PERFORM_TRACK_VALIDITY_CHECKS and DISPLAY_NO_SEGMENTS_CASES:
    no_segments_case_ids = [3413, 3476, 3533, 3992, 4328, 4648, 4703, 4733, 5130, 5501, 5693, 5908]

    for case_id_to_check in no_segments_case_ids:
        printAbp(case_id_to_check)
        
        if DISPLAY_NO_SEGMENTS_CASES_FIRST_ONLY:
            break

Select Case For Segment Extraction Validation¶

Generate segment data for one or more cases. Perform a deep analysis of event and segment quality.

In [51]:
# NOTE: This list is always defined, even when the validity checks are skipped,
# so that the model prediction plots later in the notebook match.
my_cases_of_interest_idx = [84, 198, 60, 16, 27]

# Note: By default this matches the extract-segments processing block above,
# but regenerates the data in real time so the impact of parameter changes
# on segment extraction can be observed. This is why both checkCache and
# forceWrite default to False.
positiveSegmentsMap, negativeSegmentsMap, iohEventsMap, cleanEventsMap = None, None, None, None

if PERFORM_TRACK_VALIDITY_CHECKS:
    positiveSegmentsMap, negativeSegmentsMap, iohEventsMap, cleanEventsMap = \
        extract_segments(my_cases_of_interest_idx, debug=False,
                         checkCache=False, forceWrite=False, returnSegments=True,
                         skipInvalidCleanEvents=SKIP_INVALID_CLEAN_EVENTS,
                         skipInvalidIohEvents=SKIP_INVALID_IOH_EVENTS)

Select a specific case to perform detailed low level analysis.

In [52]:
case_id_to_check = my_cases_of_interest_idx[0]
print(case_id_to_check)
print()

if PERFORM_TRACK_VALIDITY_CHECKS:
    print((
        len(positiveSegmentsMap[case_id_to_check]),
        len(negativeSegmentsMap[case_id_to_check]),
        len(iohEventsMap[case_id_to_check]),
        len(cleanEventsMap[case_id_to_check])
    ))
84

In [53]:
if PERFORM_TRACK_VALIDITY_CHECKS:
    printAbp(case_id_to_check)

Positive Events for Case - IOH Events¶

Used to define the ranges ahead of which positive segments will be extracted. Positive samples are drawn from the windows preceding each IOH event.

In [54]:
tmp_abp = None

if PERFORM_TRACK_VALIDITY_CHECKS:
    tmp_vf_path = f'{VITAL_MINI}/{case_id_to_check:04d}_mini.vital'
    tmp_vf = vitaldb.VitalFile(tmp_vf_path)
    tmp_abp = tmp_vf.to_numpy(TRACK_NAMES[0], 1/500)
In [55]:
if PERFORM_TRACK_VALIDITY_CHECKS:
    printEvents(tmp_abp, iohEventsMap, case_id_to_check, 'IOH Event Segment', normalize=False)

Negative Events for Case - Non-IOH Events¶

Used to define the ranges within which negative segments will be extracted. Negative samples are drawn from inside these clean (non-IOH) regions.

In [56]:
if PERFORM_TRACK_VALIDITY_CHECKS:
    printEvents(tmp_abp, cleanEventsMap, case_id_to_check, 'Clean Event Segment', normalize=False)

Positive Segments for Case - IOH Events Predicted Using These¶

One-minute regions sampled and used to train the model on "positive" (IOH) events.

In [57]:
if PERFORM_TRACK_VALIDITY_CHECKS:
    printSegments(positiveSegmentsMap, case_id_to_check, 'Positive Segment - IOH Event', normalize=False)

Negative Segments for Case - Non-IOH Events Predicted Using These¶

One-minute regions sampled and used to train the model on "negative" (non-IOH) events.

In [58]:
if PERFORM_TRACK_VALIDITY_CHECKS:
    printSegments(negativeSegmentsMap, case_id_to_check, 'Negative Segment - Non-Event', normalize=False)

Overlay Plot of All Events and Segments Extracted¶

For each of the cases in my_cases_of_interest_idx overlay the results of event and segment extraction.

In [59]:
DISPLAY_OVERLAY_CHECK_ABP=True
DISPLAY_OVERLAY_CHECK_ABP_FIRST_ONLY=True

if PERFORM_TRACK_VALIDITY_CHECKS and DISPLAY_OVERLAY_CHECK_ABP:
    for case_id_to_check in my_cases_of_interest_idx:
        printAbpOverlay(case_id_to_check, positiveSegmentsMap, 
                        negativeSegmentsMap, iohEventsMap, cleanEventsMap, movingAverage=True)
        
        if DISPLAY_OVERLAY_CHECK_ABP_FIRST_ONLY:
            break
In [60]:
# free memory
del tmp_abp

Generate Train/Val/Test Splits¶

In [61]:
def get_segment_attributes_from_filename(file_path):
    pieces = os.path.basename(file_path).split('_')
    case = int(pieces[0])
    startX = int(pieces[1])
    predWindow = int(pieces[2])
    label = pieces[3].replace('.h5', '')
    return (case, startX, predWindow, label)
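For reference, the filename scheme parsed above is `<case>_<startX>_<predWindow>_<label>.h5`. A self-contained sketch (restating the helper) applied to a hypothetical filename:

```python
import os

def get_segment_attributes_from_filename(file_path):
    # mirrors the helper above: <case>_<startX>_<predWindow>_<label>.h5
    pieces = os.path.basename(file_path).split('_')
    return (int(pieces[0]), int(pieces[1]), int(pieces[2]),
            pieces[3].replace('.h5', ''))

# hypothetical segment file following the naming scheme
print(get_segment_attributes_from_filename('/tmp/segments/0084_1200_3_True.h5'))
# (84, 1200, 3, 'True')
```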
In [62]:
count_negative_samples = 0
count_positive_samples = 0

samples = []

from glob import glob
seg_folder = f"{VITAL_EXTRACTED_SEGMENTS}"
filenames = [y for x in os.walk(seg_folder) for y in glob(os.path.join(x[0], '*.h5'))]

for filename in filenames:
    (case, start_x, pred_window, label) = get_segment_attributes_from_filename(filename)
    #print((case, start_x, pred_window, label))
    
    # only load segments for cases of interest; this folder could hold segments for hundreds of cases
    if case not in cases_of_interest_idx:
        continue

    if pred_window == 0 or pred_window == PREDICTION_WINDOW or PREDICTION_WINDOW == 'ALL':
        #print((case, start_x, pred_window, label))
        if label == 'True':
            count_positive_samples += 1
        else:
            count_negative_samples += 1
        sample = (filename, label)
        samples.append(sample)

print()
print(f"samples loaded:         {len(samples):5} ")
print(f'count negative samples: {count_negative_samples:5}')
print(f'count positive samples: {count_positive_samples:5}')
samples loaded:         19324 
count negative samples: 13991
count positive samples:  5333
In [63]:
# Divide by cases
sample_cases = defaultdict(list)

for fn, _ in samples:
    (case, start_x, pred_window, label) = get_segment_attributes_from_filename(fn)
    sample_cases[case].append((fn, label))

# understand any missing cases of interest
sample_cases_idx = pd.Index(sample_cases.keys())
missing_case_ids = cases_of_interest_idx.difference(sample_cases_idx)
print(f'cases with no samples: {missing_case_ids.shape[0]}')
print(f'    {missing_case_ids}')
print()
    
# Split data into training, validation, and test sets
# Use 6:1:3 ratio and prevent samples from a single case from being split across different sets
# Note: number of samples at each time point is not the same, because the first event can occur before the 3/5/10/15 minute mark

# Set target sizes
train_ratio = 0.6
val_ratio = 0.1
test_ratio = 1 - train_ratio - val_ratio # ensure ratios sum to 1

# Split samples into train and other
sample_cases_train, sample_cases_other = train_test_split(list(sample_cases.keys()), test_size=(1 - train_ratio), random_state=RANDOM_SEED)

# Split other into val and test
sample_cases_val, sample_cases_test = train_test_split(sample_cases_other, test_size=(test_ratio / (1 - train_ratio)), random_state=RANDOM_SEED)

# Check how many samples are in each set
print(f'Train/Val/Test Summary by Cases')
print(f"Train cases:  {len(sample_cases_train):5}, ({len(sample_cases_train) / len(sample_cases):.2%})")
print(f"Val cases:    {len(sample_cases_val):5}, ({len(sample_cases_val) / len(sample_cases):.2%})")
print(f"Test cases:   {len(sample_cases_test):5}, ({len(sample_cases_test) / len(sample_cases):.2%})")
print(f"Total cases:  {(len(sample_cases_train) + len(sample_cases_val) + len(sample_cases_test)):5}")
cases with no samples: 34
    Index([ 149,  268,  561,  641,  864,  979, 1158, 1174, 1317, 1600, 1957, 2158,
       2221, 2224, 2413, 2830, 2859, 3112, 3596, 3648, 3868, 4380, 4485, 4755,
       4783, 5080, 5204, 5266, 5755, 5782, 5871, 6275, 6331, 6360],
      dtype='int64')

Train/Val/Test Summary by Cases
Train cases:   1637, (59.99%)
Val cases:      272, (9.97%)
Test cases:     820, (30.05%)
Total cases:   2729
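The two-stage split above keeps all segments from one case inside a single set, preventing case-level leakage. The same idea in a minimal stdlib sketch over 100 hypothetical case ids:

```python
import random

random.seed(0)
case_ids = list(range(100))
random.shuffle(case_ids)

# 6:1:3 split over case ids (not individual segments), so every
# segment from a given case lands in exactly one split
n = len(case_ids)
train_ids = set(case_ids[:int(0.6 * n)])
val_ids   = set(case_ids[int(0.6 * n):int(0.7 * n)])
test_ids  = set(case_ids[int(0.7 * n):])

# the three sets are pairwise disjoint by construction
print(len(train_ids), len(val_ids), len(test_ids))  # 60 10 30
```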
In [64]:
sample_cases_train = set(sample_cases_train)
sample_cases_val = set(sample_cases_val)
sample_cases_test = set(sample_cases_test)

samples_train = []
samples_val = []
samples_test = []

for cid, segs in sample_cases.items():
    if cid in sample_cases_train:
        for seg in segs:
            samples_train.append(seg)
    if cid in sample_cases_val:
        for seg in segs:
            samples_val.append(seg)
    if cid in sample_cases_test:
        for seg in segs:
            samples_test.append(seg)
            
# Check how many samples are in each set
print(f'Train/Val/Test Summary by Events')
print(f"Train events:  {len(samples_train):5}, ({len(samples_train) / len(samples):.2%})")
print(f"Val events:    {len(samples_val):5}, ({len(samples_val) / len(samples):.2%})")
print(f"Test events:   {len(samples_test):5}, ({len(samples_test) / len(samples):.2%})")
print(f"Total events:  {(len(samples_train) + len(samples_val) + len(samples_test)):5}")
Train/Val/Test Summary by Events
Train events:  11665, (60.37%)
Val events:     1979, (10.24%)
Test events:    5680, (29.39%)
Total events:  19324

Validate train/val/test Splits¶

In [65]:
PRINT_ALL_CASE_SPLIT_DETAILS = False

case_to_sample_distribution = defaultdict(lambda: {'train': [0, 0], 'val': [0, 0], 'test': [0, 0]})

def populate_case_to_sample_distribution(mysamples, idx):
    neg = 0
    pos = 0
    
    for fn, _ in mysamples:
        (case, start_x, pred_window, label) = get_segment_attributes_from_filename(fn)
        slot = 0 if label == 'False' else 1
        case_to_sample_distribution[case][idx][slot] += 1
        if slot == 0:
            neg += 1
        else:
            pos += 1
                
    return (neg, pos)

train_neg, train_pos = populate_case_to_sample_distribution(samples_train, 'train')
val_neg, val_pos     = populate_case_to_sample_distribution(samples_val,   'val')
test_neg, test_pos   = populate_case_to_sample_distribution(samples_test,  'test')

print(f'Total Cases Present: {len(case_to_sample_distribution):5}')
print()

train_tot = train_pos + train_neg
val_tot = val_pos + val_neg
test_tot = test_pos + test_neg
print(f'Train: P: {train_pos:5} ({(train_pos/train_tot):.2}), N: {train_neg:5} ({(train_neg/train_tot):.2})')
print(f'Val:   P: {val_pos:5} ({(val_pos/val_tot):.2}), N: {val_neg:5} ({(val_neg/val_tot):.2})')
print(f'Test:  P: {test_pos:5} ({(test_pos/test_tot):.2}), N: {test_neg:5}  ({(test_neg/test_tot):.2})')
print()

total_pos = train_pos + val_pos + test_pos
total_neg = train_neg + val_neg + test_neg
total = total_pos + total_neg
print(f'P/N Ratio: {(total_pos)}:{(total_neg)}')
print(f'P Percent: {(total_pos/total):.2}')
print(f'N Percent: {(total_neg/total):.2}')
print()

if PRINT_ALL_CASE_SPLIT_DETAILS:
    for ci in sorted(case_to_sample_distribution.keys()):
        print(f'{ci}: {case_to_sample_distribution[ci]}')
Total Cases Present:  2729

Train: P:  3285 (0.28), N:  8380 (0.72)
Val:   P:   561 (0.28), N:  1418 (0.72)
Test:  P:  1487 (0.26), N:  4193  (0.74)

P/N Ratio: 5333:13991
P Percent: 0.28
N Percent: 0.72

In [66]:
def check_data_leakage(full_data, train_data, val_data, test_data):
    # Convert to sets for easier operations
    full_data_set = set(full_data)
    train_data_set = set(train_data)
    val_data_set = set(val_data)
    test_data_set = set(test_data)

    # Check if train, val, test are subsets of full_data
    if not train_data_set.issubset(full_data_set):
        return "Train data has leakage"
    if not val_data_set.issubset(full_data_set):
        return "Validation data has leakage"
    if not test_data_set.issubset(full_data_set):
        return "Test data has leakage"

    # Check if train, val, test are disjoint
    if train_data_set & val_data_set:
        return "Train and validation data are not disjoint"
    if train_data_set & test_data_set:
        return "Train and test data are not disjoint"
    if val_data_set & test_data_set:
        return "Validation and test data are not disjoint"

    return "No data leakage detected"

# Usage
print(check_data_leakage(list(sample_cases.keys()), sample_cases_train, sample_cases_val, sample_cases_test))
No data leakage detected
In [67]:
# Create vitalDataset class
class vitalDataset(Dataset):
    def __init__(self, samples, normalize_abp=False):
        self.samples = samples
        self.normalize_abp = normalize_abp

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        # Get metadata for this event
        segment = self.samples[idx]

        file_path = segment[0]
        label = (segment[1] == "True" or segment[1] == "True.vital")

        (abp, ecg, eeg) = get_segment_data(file_path)

        if abp is None or eeg is None or ecg is None:
            return (np.zeros(30000), np.zeros(30000), np.zeros(7680), 0)
        
        if self.normalize_abp:
            abp -= 65
            abp /= 65

        return abp, ecg, eeg, label
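The optional ABP normalization above maps the 65 mmHg IOH threshold to 0 and rescales by 65, so normalized values below 0 correspond to hypotensive pressures. A minimal sketch with hypothetical pressure values:

```python
import numpy as np

# ABP normalization used in vitalDataset: shift the 65 mmHg IOH
# threshold to 0 and rescale, so negative values mean hypotension
abp = np.array([65.0, 97.5, 130.0])
abp -= 65
abp /= 65
print(abp)  # [0.  0.5 1. ]
```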
In [68]:
NORMALIZE_ABP = False

train_dataset = vitalDataset(samples_train, NORMALIZE_ABP)
val_dataset = vitalDataset(samples_val, NORMALIZE_ABP)
test_dataset = vitalDataset(samples_test, NORMALIZE_ABP)

train/val/test Splits Summary Statistics¶

In [69]:
def generate_nan_means(mydataset):
    xs = np.zeros(len(mydataset))
    ys = np.zeros(len(mydataset), dtype=int)

    for i, (abp, ecg, eeg, y) in enumerate(iter(mydataset)):
        xs[i] = np.nanmean(abp)
        ys[i] = int(y)

    return pd.DataFrame({'abp_nanmean': xs, 'label': ys})
In [70]:
def generate_nan_means_summaries(tr, va, te, group='all'):
    if group == 'all':
        return pd.DataFrame({
            'train': tr.describe()['abp_nanmean'],
            'validation': va.describe()['abp_nanmean'],
            'test': te.describe()['abp_nanmean']
        })
    
    mytr = tr.reset_index()
    myva = va.reset_index()
    myte = te.reset_index()
    
    label_flag = (group == 'positive')
    
    return pd.DataFrame({
        'train':      mytr[mytr['label'] == label_flag].describe()['abp_nanmean'],
        'validation': myva[myva['label'] == label_flag].describe()['abp_nanmean'],
        'test':       myte[myte['label'] == label_flag].describe()['abp_nanmean']
    })
In [71]:
def plot_nan_means(df, plot_label):
    mydf = df.reset_index()

    maxCases = 'ALL' if MAX_CASES is None else MAX_CASES
    plot_title = f'{plot_label} - ABP nanmean Values, {PREDICTION_WINDOW} Minutes, {maxCases} Cases'
    
    ax = mydf[mydf['label'] == False].plot.scatter(
        x='index', y='abp_nanmean', color='DarkBlue', label='Negative', 
        title=plot_title, figsize=(16,9))

    negative_median = mydf[mydf['label'] == False]['abp_nanmean'].median()
    ax.axhline(y=negative_median, color='DarkBlue', linestyle='--', label='Negative Median')
    
    mydf[mydf['label'] == True].plot.scatter(
        x='index', y='abp_nanmean', color='DarkOrange', label='Positive', ax=ax);
    
    positive_median = mydf[mydf['label'] == True]['abp_nanmean'].median()
    ax.axhline(y=positive_median, color='DarkOrange', linestyle='--', label='Positive Median')
    
    ax.legend(loc='upper right')
In [72]:
def plot_nan_means_hist(df):
    df.plot.hist(column=['abp_nanmean'], by='label', bins=50, figsize=(10, 8));
In [73]:
train_abp_nanmeans = generate_nan_means(train_dataset)
val_abp_nanmeans = generate_nan_means(val_dataset)
test_abp_nanmeans = generate_nan_means(test_dataset)

ABP Nanmean Summaries¶

In [74]:
generate_nan_means_summaries(train_abp_nanmeans, val_abp_nanmeans, test_abp_nanmeans)
Out[74]:
train validation test
count 11665.000000 1979.000000 5680.000000
mean 85.280193 85.112286 85.286773
std 12.280448 11.551003 11.841797
min 65.136129 65.367918 65.154759
25% 75.612491 76.251820 76.134000
50% 83.419197 83.649439 83.697692
75% 93.314144 92.505598 93.093587
max 138.285504 147.949437 136.381225
In [75]:
generate_nan_means_summaries(train_abp_nanmeans, val_abp_nanmeans, test_abp_nanmeans, group='positive')
Out[75]:
train validation test
count 3285.000000 561.000000 1487.000000
mean 76.363533 76.421972 76.353014
std 9.231705 8.731134 9.058046
min 65.136129 65.367918 65.154759
25% 69.914073 69.981088 70.123858
50% 73.991152 74.319745 74.215963
75% 79.909642 80.241065 79.848930
max 132.202888 122.935320 136.381225
In [76]:
generate_nan_means_summaries(train_abp_nanmeans, val_abp_nanmeans, test_abp_nanmeans, group='negative')
Out[76]:
train validation test
count 8380.000000 1418.000000 4193.000000
mean 88.775566 88.550414 88.455030
std 11.538735 10.695512 11.069511
min 65.225560 66.473179 65.476802
25% 79.990967 80.760238 80.078761
50% 87.414185 86.901416 87.140528
75% 96.060508 95.399991 95.449323
max 138.285504 147.949437 130.780501

ABP Nanmean Histograms¶

In [77]:
plot_nan_means_hist(train_abp_nanmeans)
In [78]:
plot_nan_means_hist(val_abp_nanmeans)
In [79]:
plot_nan_means_hist(test_abp_nanmeans)

ABP Nanmean Scatter Plots¶

In [80]:
plot_nan_means(train_abp_nanmeans, 'Train')
In [81]:
plot_nan_means(val_abp_nanmeans, 'Validation')
In [82]:
plot_nan_means(test_abp_nanmeans, 'Test')
In [83]:
# Cleanup
del train_abp_nanmeans
del val_abp_nanmeans
del test_abp_nanmeans

Classification Studies¶

Check whether the data can be classified using non-deep-learning methods. Create a balanced sample of IOH and non-IOH events and apply a simple classifier to see how easily the data separates. Datasets that simple classifiers separate easily should also be easy for deep learning models to classify.

In [84]:
MAX_CLASSIFICATION_SAMPLES = 250
MAX_SAMPLE_SIZE = 1600
classification_sample_size = min(MAX_SAMPLE_SIZE, len(samples))

classification_samples = random.sample(samples, classification_sample_size)

positive_samples = []
negative_samples = []

for sample in classification_samples:
    (sampleAbp, sampleEcg, sampleEeg) = get_segment_data(sample[0])
    
    if sample[1] == "True":
        positive_samples.append([sample[0], True, sampleAbp, sampleEcg, sampleEeg])
    else:
        negative_samples.append([sample[0], False, sampleAbp, sampleEcg, sampleEeg])

positive_samples = pd.DataFrame(positive_samples, columns=["file_path", "segment_label", "segment_abp", "segment_ecg", "segment_eeg"])
negative_samples = pd.DataFrame(negative_samples, columns=["file_path", "segment_label", "segment_abp", "segment_ecg", "segment_eeg"])

total_to_sample_pos = min(MAX_CLASSIFICATION_SAMPLES, len(positive_samples))
total_to_sample_neg = min(MAX_CLASSIFICATION_SAMPLES, len(negative_samples))

# Select up to MAX_CLASSIFICATION_SAMPLES random samples where segment_label is True
positive_samples = positive_samples.sample(total_to_sample_pos, random_state=RANDOM_SEED)
# Select up to MAX_CLASSIFICATION_SAMPLES random samples where segment_label is False
negative_samples = negative_samples.sample(total_to_sample_neg, random_state=RANDOM_SEED)

print(f'positive_samples: {len(positive_samples)}')
print(f'negative_samples: {len(negative_samples)}')

# Combine the positive and negative samples
samples_balanced = pd.concat([positive_samples, negative_samples])
positive_samples: 250
negative_samples: 250

Define a function to build the data for the study. Each waveform channel can be enabled or disabled:

In [85]:
def get_x_y(samples, use_abp, use_ecg, use_eeg):
    # Create X and y from `samples`, honoring the `use_abp`, `use_ecg`, and `use_eeg` flags
    X = []
    y = []
    for i in range(len(samples)):
        row = samples.iloc[i]
        sample = np.array([])
        if use_abp:
            if len(row['segment_abp']) != 30000:
                print(len(row['segment_abp']))
            sample = np.append(sample, row['segment_abp'])
        if use_ecg:
            if len(row['segment_ecg']) != 30000:
                print(len(row['segment_ecg']))
            sample = np.append(sample, row['segment_ecg'])
        if use_eeg:
            if len(row['segment_eeg']) != 7680:
                print(len(row['segment_eeg']))
            sample = np.append(sample, row['segment_eeg'])
        X.append(sample)
        # Convert the label from boolean to 0 or 1
        y.append(int(row['segment_label']))
    return X, y
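Because `np.append` flattens and concatenates, enabling all three channels yields one flat feature vector of length 30000 + 30000 + 7680 = 67680 per sample. A minimal sketch with dummy arrays sized like the segments above:

```python
import numpy as np

# dummy per-channel arrays shaped like the extracted segments
abp = np.zeros(30000)
ecg = np.zeros(30000)
eeg = np.zeros(7680)

# np.append flattens and concatenates each enabled channel in turn
sample = np.array([])
for channel in (abp, ecg, eeg):
    sample = np.append(sample, channel)
print(sample.shape)  # (67680,)
```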

KNN¶

Define the KNN run. Each data channel can be enabled or disabled so the channels can be studied individually or together:

In [86]:
N_NEIGHBORS = 20

def run_knn(samples, use_abp, use_ecg, use_eeg):
    # Get samples
    X,y = get_x_y(samples, use_abp, use_ecg, use_eeg)

    # Split samples into train and val
    knn_X_train, knn_X_test, knn_y_train, knn_y_test = train_test_split(X, y, test_size=0.2, random_state=RANDOM_SEED)

    # Normalize the data
    scaler = StandardScaler()
    scaler.fit(knn_X_train)

    knn_X_train = scaler.transform(knn_X_train)
    knn_X_test = scaler.transform(knn_X_test)

    # Initialize the KNN classifier
    knn = KNeighborsClassifier(n_neighbors=N_NEIGHBORS)

    # Train the KNN classifier
    knn.fit(knn_X_train, knn_y_train)

    # Make predictions on the test set
    knn_y_pred = knn.predict(knn_X_test)

    # Evaluate the KNN classifier
    print(f"ABP: {use_abp}, ECG: {use_ecg}, EEG: {use_eeg}")
    print(f"Confusion matrix:\n{confusion_matrix(knn_y_test, knn_y_pred)}")
    print(f"Classification report:\n{classification_report(knn_y_test, knn_y_pred)}")

Study each waveform independently, then ABP+EEG (which had the best results in the paper), and finally ABP+ECG+EEG:

In [87]:
run_knn(samples_balanced, use_abp=True, use_ecg=False, use_eeg=False)
run_knn(samples_balanced, use_abp=False, use_ecg=True, use_eeg=False)
run_knn(samples_balanced, use_abp=False, use_ecg=False, use_eeg=True)
run_knn(samples_balanced, use_abp=True, use_ecg=False, use_eeg=True)
run_knn(samples_balanced, use_abp=True, use_ecg=True, use_eeg=True)
ABP: True, ECG: False, EEG: False
Confusion matrix:
[[48  6]
 [20 26]]
Classification report:
              precision    recall  f1-score   support

           0       0.71      0.89      0.79        54
           1       0.81      0.57      0.67        46

    accuracy                           0.74       100
   macro avg       0.76      0.73      0.73       100
weighted avg       0.75      0.74      0.73       100

ABP: False, ECG: True, EEG: False
Confusion matrix:
[[32 22]
 [21 25]]
Classification report:
              precision    recall  f1-score   support

           0       0.60      0.59      0.60        54
           1       0.53      0.54      0.54        46

    accuracy                           0.57       100
   macro avg       0.57      0.57      0.57       100
weighted avg       0.57      0.57      0.57       100

ABP: False, ECG: False, EEG: True
Confusion matrix:
[[ 6 48]
 [ 6 40]]
Classification report:
              precision    recall  f1-score   support

           0       0.50      0.11      0.18        54
           1       0.45      0.87      0.60        46

    accuracy                           0.46       100
   macro avg       0.48      0.49      0.39       100
weighted avg       0.48      0.46      0.37       100

ABP: True, ECG: False, EEG: True
Confusion matrix:
[[42 12]
 [17 29]]
Classification report:
              precision    recall  f1-score   support

           0       0.71      0.78      0.74        54
           1       0.71      0.63      0.67        46

    accuracy                           0.71       100
   macro avg       0.71      0.70      0.71       100
weighted avg       0.71      0.71      0.71       100

ABP: True, ECG: True, EEG: True
Confusion matrix:
[[34 20]
 [12 34]]
Classification report:
              precision    recall  f1-score   support

           0       0.74      0.63      0.68        54
           1       0.63      0.74      0.68        46

    accuracy                           0.68       100
   macro avg       0.68      0.68      0.68       100
weighted avg       0.69      0.68      0.68       100

Based on the results above, the ABP data alone is the most predictive, with a macro-average F1-score of 0.73. The ECG and EEG data are weakly predictive, with macro-average F1-scores of 0.57 and 0.39, respectively. The ABP+EEG combination is comparably predictive with a macro-average F1-score of 0.71, and ABP+ECG+EEG slightly less so at 0.68.

Models based on ABP data alone or ABP+EEG are therefore expected to train readily with good performance. The other signals appear to contribute mostly noise and are not strongly predictive on their own. This agrees with the results from the paper.

t-SNE¶

Define the t-SNE run. This is configurable to enable or disable different data channels so that we can study them individually or together:

In [88]:
def run_tsne(samples, use_abp, use_ecg, use_eeg):
    # Get samples
    X,y = get_x_y(samples, use_abp, use_ecg, use_eeg)
    
    # Convert X and y to numpy arrays
    X = np.array(X)
    y = np.array(y)

    # Run t-SNE on the samples (n_components equals the number of classes, i.e. 2 for binary labels)
    tsne = TSNE(n_components=len(np.unique(y)), random_state=RANDOM_SEED)
    X_tsne = tsne.fit_transform(X)
    
    # Create a scatter plot of the t-SNE representation
    plt.figure(figsize=(16, 9))
    plt.title(f"use_abp={use_abp}, use_ecg={use_ecg}, use_eeg={use_eeg}")
    for i, label in enumerate(set(y)):
        plt.scatter(X_tsne[y == label, 0], X_tsne[y == label, 1], label=label)
    plt.legend()
    plt.show()

Study each waveform independently, then ABP+EEG (which had the best results in the paper), and ABP+ECG+EEG:

In [89]:
run_tsne(samples_balanced, use_abp=True, use_ecg=False, use_eeg=False)
run_tsne(samples_balanced, use_abp=False, use_ecg=True, use_eeg=False)
run_tsne(samples_balanced, use_abp=False, use_ecg=False, use_eeg=True)
run_tsne(samples_balanced, use_abp=True, use_ecg=False, use_eeg=True)
run_tsne(samples_balanced, use_abp=True, use_ecg=True, use_eeg=True)

Based on the plots above, ABP alone, ABP+EEG, and ABP+ECG+EEG appear somewhat separable, though with outliers, and should be learnable by our model. The ECG and EEG data alone do not separate the classes well. This agrees with the results from the paper.

In [90]:
# cleanup
del samples_balanced

Model¶

The model implementation is based on the CNN architecture described in Jo Y-Y et al. (2022). It is designed to handle 1, 2, or 3 signal categories simultaneously, allowing for flexible model configurations based on different combinations of physiological signals:

  • ABP alone
  • EEG alone
  • ECG alone
  • ABP + EEG
  • ABP + ECG
  • EEG + ECG
  • ABP + EEG + ECG

Model Architecture¶

The architecture, as depicted in Figure 2 from the original paper, utilizes a ResNet-based approach tailored for time-series data from different physiological signals. The model architecture is adapted to handle varying input signal frequencies, with specific hyperparameters for each signal type, particularly EEG, due to its distinct characteristics compared to ABP and ECG. A diagram of the model architecture is shown below:

Architecture of the hypotension risk prediction model using multiple waveforms

Each input signal is processed through a sequence of twelve 7-layer residual blocks, followed by flattening and a linear transformation that produces a 32-dimensional feature vector per signal type. These vectors are then concatenated (if multiple signals are used) and passed through two additional linear layers to produce a single output value representing the IOH index. A threshold, determined experimentally to minimize the difference between sensitivity and specificity, is applied to this index to perform binary classification for predicting IOH events.
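
The threshold rule above can be sketched in a few lines of NumPy. This is a toy illustration with made-up scores and labels, using the same 0.01-step grid search over candidate thresholds as the notebook's evaluation code:

```python
import numpy as np

# Toy scores and labels (hypothetical values, for illustration only)
y_true = np.array([0, 0, 0, 1, 1, 1])
y_score = np.array([0.105, 0.335, 0.605, 0.415, 0.705, 0.905])

best_t, best_diff = None, float('inf')
for t in np.linspace(0, 1, 101):  # 0 to 1 in 0.01 steps
    pred = (y_score > t).astype(int)
    tp = np.sum((pred == 1) & (y_true == 1))
    fn = np.sum((pred == 0) & (y_true == 1))
    tn = np.sum((pred == 0) & (y_true == 0))
    fp = np.sum((pred == 1) & (y_true == 0))
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    # keep the threshold with the smallest sensitivity/specificity gap
    if abs(sensitivity - specificity) < best_diff:
        best_diff, best_t = abs(sensitivity - specificity), t

print(best_t, best_diff)
```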

The hyperparameters for the residual blocks are specified in Supplemental Table 1 from the original paper and vary by signal type.

A forward pass through each signal branch traverses 85 layers before concatenation, followed by two more linear layers and finally a sigmoid activation layer to produce the prediction measure.
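
As a quick sanity check on that layer count (the seven layers in each of the twelve residual blocks, plus the per-signal linear projection to 32 dimensions):

```python
blocks = 12            # residual blocks per signal branch
layers_per_block = 7   # BN, ReLU, Dropout, Conv, BN, ReLU, Conv
per_signal_linear = 1  # the 32-dimensional projection after flattening

# layers traversed per signal branch before concatenation
print(blocks * layers_per_block + per_signal_linear)
```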

Residual Block Definition¶

Each residual block consists of the following seven layers:

  • Batch normalization
  • ReLU
  • Dropout (0.5)
  • 1D convolution
  • Batch normalization
  • ReLU
  • 1D convolution

Skip connections are included to aid in gradient flow during training, with optional 1D convolution in the skip connection to align dimensions.

Residual Block Hyperparameters¶

The hyperparameters are detailed in Supplemental Table 1 of the original paper. A screenshot of these hyperparameters is provided for reference below:

Supplemental Table 1 from original paper

Note: Please be aware of a transcription error in the original paper's Supplemental Table 1 for the ECG+ABP configuration in Residual Blocks 11 and 12, where the output size should be 469×6 instead of the reported 496×6.
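
The corrected value can be reproduced by walking the 30000-sample (60 s at 500 Hz) input through the table's size-down schedule, where every other residual block halves the sequence length (rounding up on odd lengths), as in the model code below:

```python
import math

# Per-block sequence lengths for the 500 Hz tracks (ABP/ECG):
# even-indexed blocks halve the length, rounding up on odd lengths.
length = 30000  # 60 s at 500 Hz
lengths = []
for i in range(12):
    if i % 2 == 0:  # size-down blocks
        length = math.ceil(length / 2)
    lengths.append(length)

print(lengths)
# the final blocks end at length 469, matching the corrected 469×6 output size
```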

Training Objectives¶

Our model uses binary cross entropy as the loss function and Adam as the optimizer, consistent with the original study. The learning rate is set at 0.0001, and training is configured to run for up to 100 epochs, with early stopping implemented if no improvement in loss is observed over five consecutive epochs.
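
The early-stopping policy can be sketched as a small helper. This is a minimal illustration of the rule described above, not the project's actual implementation (which lives in `run_experiment` below):

```python
def should_stop(val_losses, patience=5):
    """Return True once the best validation loss is `patience` or more epochs old."""
    if len(val_losses) <= patience:
        return False
    # index of the epoch with the lowest validation loss so far
    best_idx = min(range(len(val_losses)), key=lambda i: val_losses[i])
    return (len(val_losses) - 1 - best_idx) >= patience

# Best loss at epoch 1, then five epochs without improvement: stop.
print(should_stop([0.50, 0.40, 0.41, 0.42, 0.43, 0.44, 0.45]))  # True
```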

In [91]:
# First define the residual block which is reused 12x for each data track for each sample.
# Second define the primary model.
class ResidualBlock(nn.Module):
    def __init__(self, in_features: int, out_features: int, in_channels: int, out_channels: int, kernel_size: int, stride: int = 1, size_down: bool = False, ignoreSkipConnection: bool = False) -> None:
        super(ResidualBlock, self).__init__()
        
        self.ignoreSkipConnection = ignoreSkipConnection

        # calculate the padding required to preserve the expected sequence length out of each
        # residual block; for stride=1 this reduces to the usual "same" padding of (kernel_size-1)//2
        padding = int((((stride-1)*in_features)-stride+kernel_size)/2)

        self.size_down = size_down
        self.bn1 = nn.BatchNorm1d(in_channels)
        self.relu = nn.ReLU()
        self.dropout = nn.Dropout(0.5)
        self.conv1 = nn.Conv1d(in_channels, out_channels, kernel_size=kernel_size, stride=1, padding=padding, bias=False)
        self.bn2 = nn.BatchNorm1d(out_channels)
        self.conv2 = nn.Conv1d(out_channels, out_channels, kernel_size=kernel_size, stride=1, padding=padding, bias=False)
        
        self.residualConv = nn.Conv1d(in_channels, out_channels, kernel_size=kernel_size, stride=1, padding=padding, bias=False)

        # It is unclear where within the block this downsampling should occur;
        # the size-down schedule is specified in Supplemental Table S1.
        if self.size_down:
            pool_padding = (1 if (in_features % 2 > 0) else 0)
            self.downsample = nn.MaxPool1d(kernel_size=2, stride=2, padding = pool_padding)
        
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        identity = x
        
        out = self.bn1(x)
        out = self.relu(out)
        out = self.dropout(out)
        out = self.conv1(out)

        if self.size_down:
            out = self.downsample(out)

        out = self.bn2(out)
        out = self.relu(out)
        out = self.conv2(out)
        
        if not self.ignoreSkipConnection:
          if out.shape != identity.shape:
              # run the residual through a convolution when necessary
              identity = self.residualConv(identity)
            
              outlen = np.prod(out.shape)
              idlen = np.prod(identity.shape)
              # downsample when required
              if idlen > outlen:
                  identity = self.downsample(identity)
              # match dimensions
              identity = identity.reshape(out.shape)

          # add the residual
          out += identity

        return out

class HypotensionCNN(nn.Module):
    def __init__(self, useAbp: bool = True, useEeg: bool = False, useEcg: bool = False, device: str = "cpu", nResiduals: int = 12, ignoreSkipConnection: bool = False, useSigmoid: bool = True) -> None:
        assert useAbp or useEeg or useEcg, "At least one data track must be used"
        assert nResiduals > 0 and nResiduals <= 12, "Number of residual blocks must be between 1 and 12"
        super(HypotensionCNN, self).__init__()

        self.device = device

        self.useAbp = useAbp
        self.useEeg = useEeg
        self.useEcg = useEcg
        self.nResiduals = nResiduals
        self.useSigmoid = useSigmoid

        # Size of the concatenated output from the residual blocks
        concatSize = 0

        if useAbp:
          self.abpBlocks = []
          self.abpMultipliers = [1, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 6, 6]
          self.abpSizes = [30000, 15000, 15000, 7500, 7500, 3750, 3750, 1875, 1875, 938, 938, 469, 469]
          for i in range(self.nResiduals):
            downsample = i % 2 == 0
            self.abpBlocks.append(ResidualBlock(self.abpSizes[i], self.abpSizes[i+1], self.abpMultipliers[i], self.abpMultipliers[i+1], 15 if i < 6 else 7, 1, downsample, ignoreSkipConnection))
          self.abpResiduals = nn.Sequential(*self.abpBlocks)
          self.abpFc = nn.Linear(self.abpMultipliers[self.nResiduals] * self.abpSizes[self.nResiduals], 32)
          concatSize += 32
        
        if useEcg:
          self.ecgBlocks = []
          self.ecgMultipliers = [1, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 6, 6]
          self.ecgSizes = [30000, 15000, 15000, 7500, 7500, 3750, 3750, 1875, 1875, 938, 938, 469, 469]

          for i in range(self.nResiduals):
            downsample = i % 2 == 0
            self.ecgBlocks.append(ResidualBlock(self.ecgSizes[i], self.ecgSizes[i+1], self.ecgMultipliers[i], self.ecgMultipliers[i+1], 15 if i < 6 else 7, 1, downsample, ignoreSkipConnection))
          self.ecgResiduals = nn.Sequential(*self.ecgBlocks)
          self.ecgFc = nn.Linear(self.ecgMultipliers[self.nResiduals] * self.ecgSizes[self.nResiduals], 32)
          concatSize += 32

        if useEeg:
          self.eegBlocks = []
          self.eegMultipliers = [1, 2, 2, 2, 2, 2, 4, 4, 4, 4, 4, 6, 6]
          self.eegSizes = [7680, 3840, 3840, 1920, 1920, 960, 960, 480, 480, 240, 240, 120, 120]

          for i in range(self.nResiduals):
            downsample = i % 2 == 0
            self.eegBlocks.append(ResidualBlock(self.eegSizes[i], self.eegSizes[i+1], self.eegMultipliers[i], self.eegMultipliers[i+1], 7 if i < 6 else 3, 1, downsample, ignoreSkipConnection))
          self.eegResiduals = nn.Sequential(*self.eegBlocks)
          self.eegFc = nn.Linear(self.eegMultipliers[self.nResiduals] * self.eegSizes[self.nResiduals], 32)
          concatSize += 32

        self.fullLinear1 = nn.Linear(concatSize, 16)
        self.fullLinear2 = nn.Linear(16, 1)
        self.sigmoid = nn.Sigmoid()


    def forward(self, abp: torch.Tensor, eeg: torch.Tensor, ecg: torch.Tensor) -> torch.Tensor:
        batchSize = len(abp)

        # conditionally operate ABP, EEG, and ECG networks
        tensors = []
        if self.useAbp:
          self.abpResiduals.to(self.device)
          abp = self.abpResiduals(abp)
          totalLen = np.prod(abp.shape)
          abp = torch.reshape(abp, (batchSize, int(totalLen / batchSize)))
          abp = self.abpFc(abp)
          tensors.append(abp)

        if self.useEeg:
          self.eegResiduals.to(self.device)
          eeg = self.eegResiduals(eeg)
          totalLen = np.prod(eeg.shape)
          eeg = torch.reshape(eeg, (batchSize, int(totalLen / batchSize)))
          eeg = self.eegFc(eeg)
          tensors.append(eeg)
        
        if self.useEcg:
          self.ecgResiduals.to(self.device)
          ecg = self.ecgResiduals(ecg)
          totalLen = np.prod(ecg.shape)
          ecg = torch.reshape(ecg, (batchSize, int(totalLen / batchSize)))
          ecg = self.ecgFc(ecg)
          tensors.append(ecg)

        # concatenate the tensors along dimension 1 if there's more than one, otherwise use the single tensor
        merged = torch.cat(tensors, dim=1) if len(tensors) > 1 else tensors[0]

        totalLen = np.prod(merged.shape)
        merged = torch.reshape(merged, (batchSize, int(totalLen / batchSize)))
        out = self.fullLinear1(merged)
        out = self.fullLinear2(out)
        if self.useSigmoid:
            out = self.sigmoid(out)

        # We should not be seeing NaNs! If we are, there is a problem upstream.
        #out = torch.nan_to_num(out)
        return out

Training¶

As discussed earlier, our model uses binary cross entropy as the loss function and Adam as the optimizer, consistent with the original study. The learning rate is set at 0.0001, and training is configured to run for up to 100 epochs, with early stopping implemented if no improvement in loss is observed over five consecutive epochs.

In [92]:
def train_model_one_iter(model, device, loss_func, optimizer, train_loader):
    model.train()
    train_losses = []
    
    for abp, ecg, eeg, label in tqdm(train_loader):
        batch = len(abp)
        abp = abp.reshape(batch, 1, -1).type(torch.FloatTensor).to(device)
        ecg = ecg.reshape(batch, 1, -1).type(torch.FloatTensor).to(device)
        eeg = eeg.reshape(batch, 1, -1).type(torch.FloatTensor).to(device)
        label = label.type(torch.float).reshape(batch, 1).to(device)

        optimizer.zero_grad()
        mdl = model(abp, eeg, ecg)
        loss = loss_func(torch.nan_to_num(mdl), label)
        loss.backward()
        optimizer.step()
        train_losses.append(loss.cpu().data.numpy())
    return np.mean(train_losses)
In [93]:
def evaluate_model(model, loss_func, val_loader):
    # Note: relies on the global `device` defined earlier.
    model.eval()
    val_losses = []
    with torch.no_grad():  # no gradients needed during validation
        for abp, ecg, eeg, label in tqdm(val_loader):
            batch = len(abp)

            abp = abp.reshape(batch, 1, -1).type(torch.FloatTensor).to(device)
            ecg = ecg.reshape(batch, 1, -1).type(torch.FloatTensor).to(device)
            eeg = eeg.reshape(batch, 1, -1).type(torch.FloatTensor).to(device)
            label = label.type(torch.float).reshape(batch, 1).to(device)

            mdl = model(abp, eeg, ecg)
            loss = loss_func(torch.nan_to_num(mdl), label)
            val_losses.append(loss.cpu().data.numpy())
    return np.mean(val_losses)
In [94]:
def plot_losses(train_losses, val_losses, best_epoch, experimentName):
    print()
    print(f'Plot Validation and Loss Values from Training')
    print(f'  Epoch with best Validation Loss:  {best_epoch:3}, {val_losses[best_epoch]:.4}')

    # Create x-axis values for epochs
    epochs = range(0, len(train_losses))

    plt.figure(figsize=(16, 9))

    # Plot the training and validation losses
    plt.plot(epochs, train_losses, 'b', label='Training Loss')
    plt.plot(epochs, val_losses, 'r', label='Validation Loss')

    # Add a vertical bar at the best_epoch
    plt.axvline(x=best_epoch, color='g', linestyle='--', label='Best Epoch')

    # Shade everything to the right of the best_epoch a light red
    plt.axvspan(best_epoch, max(epochs), facecolor='r', alpha=0.1)

    # Add labels and title
    plt.xlabel('Epochs')
    plt.ylabel('Loss')
    plt.title(experimentName)

    # Add legend
    plt.legend(loc='upper right')

    # Save plot to disk
    plt.savefig(os.path.join(VITAL_RUNS, f'{experimentName}_losses.png'))

    # Show the plot
    plt.show()
In [95]:
def eval_model(model, device, dataloader, loss_func, print_detailed: bool = False):
    model.eval()
    model = model.to(device)
    total_loss = 0
    all_predictions = []
    all_labels = []

    with torch.no_grad():
        for abp, ecg, eeg, label in tqdm(dataloader):
            batch = len(abp)
    
            abp = torch.nan_to_num(abp.reshape(batch, 1, -1)).type(torch.FloatTensor).to(device)
            ecg = torch.nan_to_num(ecg.reshape(batch, 1, -1)).type(torch.FloatTensor).to(device)
            eeg = torch.nan_to_num(eeg.reshape(batch, 1, -1)).type(torch.FloatTensor).to(device)
            label = label.type(torch.float).reshape(batch, 1).to(device)
   
            pred = model(abp, eeg, ecg)
            loss = loss_func(pred, label)
            total_loss += loss.item()

            all_predictions.append(pred.detach().cpu().numpy())
            all_labels.append(label.detach().cpu().numpy())

    # Flatten the lists
    all_predictions = np.concatenate(all_predictions).flatten()
    all_labels = np.concatenate(all_labels).flatten()

    # Calculate AUROC and AUPRC
    # y_true, y_pred
    auroc = roc_auc_score(all_labels, all_predictions)
    precision, recall, _ = precision_recall_curve(all_labels, all_predictions)
    auprc = auc(recall, precision)

    # Determine the optimal threshold, which is argmin(abs(sensitivity - specificity)) per the paper
    thresholds = np.linspace(0, 1, 101) # 0 to 1 in 0.01 steps
    min_diff = float('inf')
    optimal_sensitivity = None
    optimal_specificity = None
    optimal_threshold = None

    for threshold in thresholds:
        all_predictions_binary = (all_predictions > threshold).astype(int)

        tn, fp, fn, tp = confusion_matrix(all_labels, all_predictions_binary).ravel()
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        diff = abs(sensitivity - specificity)

        if diff < min_diff:
            min_diff = diff
            optimal_threshold = threshold
            optimal_sensitivity = sensitivity
            optimal_specificity = specificity

    avg_loss = total_loss / len(dataloader)
    
    # accuracy
    predictions_binary = (all_predictions > optimal_threshold).astype(int)
    accuracy = np.mean(predictions_binary == all_labels)

    if print_detailed:
        print(f"Predictions: {all_predictions}")
        print(f"Labels: {all_labels}")
    print(f"Loss: {avg_loss}")
    print(f"AUROC: {auroc}")
    print(f"AUPRC: {auprc}")
    print(f"Sensitivity: {optimal_sensitivity}")
    print(f"Specificity: {optimal_specificity}")
    print(f"Threshold: {optimal_threshold}")
    print(f"Accuracy:  {accuracy}")

    return all_predictions, all_labels, avg_loss, auroc, auprc, \
        optimal_sensitivity, optimal_specificity, optimal_threshold, accuracy
In [96]:
def print_all_evals(model, models, device, val_loader, test_loader, loss_func, print_detailed: bool = False):
    print()
    print(f'Generate AUROC/AUPRC for Each Intermediate Model')
    print()
    val_aurocs = []
    val_auprcs = []
    val_accs   = []

    test_aurocs = []
    test_auprcs = []
    test_accs   = []

    for mod in models:
        model.load_state_dict(torch.load(mod))
        #model.train(False)
        model.eval()
        print(f'Intermediate Model:')
        print(f'  {mod}')
    
        # validation loop
        print("AUROC/AUPRC on Validation Data")
        all_predictions, all_labels, avg_loss, valid_auroc, valid_auprc, \
        optimal_sensitivity, optimal_specificity, optimal_threshold, valid_accuracy = \
            eval_model(model, device, val_loader, loss_func, print_detailed)

        val_aurocs.append(valid_auroc)
        val_auprcs.append(valid_auprc)
        val_accs.append(valid_accuracy)
        print()
    
        # test loop
        print("AUROC/AUPRC on Test Data")
        all_predictions, all_labels, avg_loss, test_auroc, test_auprc, \
        optimal_sensitivity, optimal_specificity, optimal_threshold, test_accuracy = \
            eval_model(model, device, test_loader, loss_func, print_detailed)

        test_aurocs.append(test_auroc)
        test_auprcs.append(test_auprc)
        test_accs.append(test_accuracy)
        print()
    
    return val_aurocs, val_auprcs, val_accs, test_aurocs, test_auprcs, test_accs
In [97]:
def plot_auroc_auprc(val_losses, val_aurocs, val_auprcs, val_accs, 
                                      test_aurocs, test_auprcs, test_accs, all_models, best_epoch, experimentName):
    print()
    print(f'Plot AUROC/AUPRC for Each Intermediate Model')
    
    # Create x-axis values for epochs
    epochs = range(0, len(val_aurocs))

    # Find model with highest AUROC
    np_test_aurocs = np.array(test_aurocs)
    test_auroc_idx = np.argmax(np_test_aurocs)
    test_accs_idx  = np.argmax(test_accs)

    print(f'  Epoch with best Validation Loss:     {best_epoch:3}, {val_losses[best_epoch]:.4}')
    print(f'  Epoch with best model Test AUROC:    {test_auroc_idx:3}, {np_test_aurocs[test_auroc_idx]:.4}')
    print(f'  Epoch with best model Test Accuracy: {test_accs_idx:3}, {test_accs[test_accs_idx]:.4}')
    #print(f'Best Model on Validation Loss:')
    #print(f'  {all_models[test_auroc_idx]}')
    #print(f'Best Model on Test AUROC:')
    #print(f'  {all_models[best_epoch]}')
    print()

    plt.figure(figsize=(16, 9))

    # Plots
    plt.plot(epochs, val_aurocs, 'C0', label='AUROC - Validation')
    plt.plot(epochs, test_aurocs, 'C1', label='AUROC - Test')

    plt.plot(epochs, val_auprcs, 'C2', label='AUPRC - Validation')
    plt.plot(epochs, test_auprcs, 'C3', label='AUPRC - Test')
    
    plt.plot(epochs, val_accs, 'C4', label='Accuracy - Validation')
    plt.plot(epochs, test_accs, 'C5', label='Accuracy - Test')

    # Add vertical bars
    plt.axvline(x=best_epoch, color='g', linestyle='--', label='Best Epoch - Validation Loss')
    plt.axvline(x=test_auroc_idx, color='maroon', linestyle='--', label='Best Epoch - Test AUROC')
    plt.axvline(x=test_accs_idx, color='violet', linestyle='--', label='Best Epoch - Test Accuracy')

    # Shade everything to the right of the best_model a light red
    plt.axvspan(test_auroc_idx, max(epochs), facecolor='r', alpha=0.1)

    # Add labels and title
    plt.xlabel('Epochs')
    plt.ylabel('AUROC / AUPRC')
    plt.title('Validation and Test AUROC and AUPRC by Model Iteration Across Training')

    # Add legend
    plt.legend(loc='right')

    # Save plot to disk
    plt.savefig(os.path.join(VITAL_RUNS, f'{experimentName}_all_stats.png'))
    
    # Show the plot
    plt.show()

    return np_test_aurocs, test_auroc_idx
In [98]:
# applies the model to a given real case to generate predictions
def predictionsForModel(case_id_to_check, my_model, my_model_state, device):
    (abp, ecg, eeg, event) = get_track_data(case_id_to_check)
    
    opstart = cases.loc[case_id_to_check]['opstart'].item()
    opend = cases.loc[case_id_to_check]['opend'].item()

    abp = abp[opstart*500:opend*500]
    ecg = ecg[opstart*500:opend*500]
    eeg = eeg[opstart*128:opend*128]
    
    # number of one minute segments in each track
    splits_abp = abp.shape[0] // (60 * 500)
    splits_ecg = ecg.shape[0] // (60 * 500)
    splits_eeg = eeg.shape[0] // (60 * 128)
    
    # predict as long as each track has data in the prediction window
    splits = np.min([splits_abp, splits_ecg, splits_eeg])
    
    #print(splits_abp)
    #print(splits_ecg)
    #print(splits_eeg)
    #print(splits)
    
    preds = []
    
    my_model.load_state_dict(torch.load(my_model_state))
    my_model.eval()
    my_model = my_model.to(device)
    
    for i in range(splits):
        t_abp = abp[i*60*500:(i + 1)*60*500]
        t_ecg = ecg[i*60*500:(i + 1)*60*500]
        t_eeg = eeg[i*60*128:(i + 1)*60*128]
    
        if len(t_abp) < 30000:
            t_abp = np.resize(t_abp, (30000))
            
        if len(t_ecg) < 30000:
            t_ecg = np.resize(t_ecg, (30000))
            
        if len(t_eeg) < 7680:
            t_eeg = np.resize(t_eeg, (7680))
            
        t_abp = torch.from_numpy(t_abp)
        t_ecg = torch.from_numpy(t_ecg)
        t_eeg = torch.from_numpy(t_eeg)
        
        t_abp = torch.nan_to_num(t_abp.reshape(1, 1, -1)).type(torch.FloatTensor).to(device)
        t_ecg = torch.nan_to_num(t_ecg.reshape(1, 1, -1)).type(torch.FloatTensor).to(device)
        t_eeg = torch.nan_to_num(t_eeg.reshape(1, 1, -1)).type(torch.FloatTensor).to(device)

        pred = my_model(t_abp, t_eeg, t_ecg)
        preds.append(pred.detach().cpu().numpy())
    
    return np.concatenate(preds).flatten()
In [99]:
def printModelPrediction(
    case_id_to_check,
    positiveSegmentsMap,
    negativeSegmentsMap,
    iohEventsMap,
    cleanEventsMap,
    preds,
    experimentName
):  
    (abp, ecg, eeg, event) = get_track_data(case_id_to_check)
    
    opstart = cases.loc[case_id_to_check]['opstart'].item()
    opend = cases.loc[case_id_to_check]['opend'].item()
    minutes = (opend - opstart) / 60
    
    plt.figure(figsize=(24, 8))
    plt.margins(0)
    plt.title(f'ABP - Mean Arterial Pressure - Case: {case_id_to_check} - Operating Time: {minutes} minutes')
    plt.axhline(y = 65, color = 'maroon', linestyle = '--')
    
    opstart = opstart * 500
    opend = opend * 500
    
    minute_step = 5
    
    abp_mov_avg = moving_average(abp[opstart:(opend + 60*500)])
    myx = np.arange(opstart, opstart + len(abp_mov_avg), 1)
    plt.plot(myx, abp_mov_avg, 'purple')
    x_ticks = np.arange(opstart, opend, step=minute_step*30000)
    x_labels = [str(i*minute_step) for i in range(len(x_ticks))]
    plt.xticks(x_ticks, labels=x_labels)
    plt.savefig(os.path.join(VITAL_RUNS, f'{experimentName}_surgery_map.png'))
    plt.show()
    
    plt.figure(figsize=(24, 8))
    plt.margins(0)
    plt.title(f'Model Predictions for One Minute Intervals Using {PREDICTION_WINDOW} Minute Prediction Window')
    plt.plot(preds)
    x_ticks = np.arange(0, len(preds), step=minute_step)
    x_labels = [str(i*minute_step) for i in range(len(x_ticks))]
    plt.xticks(x_ticks, labels=x_labels)
    plt.savefig(os.path.join(VITAL_RUNS, f'{experimentName}_surgery_predictions.png'))
    plt.show()
    
    return preds
In [100]:
def run_experiment(
    experimentNamePrefix: str = None,
    useAbp: bool = True, 
    useEeg: bool = False, 
    useEcg: bool = False, 
    nResiduals: int = 12, 
    skip_connection: bool = False, 
    batch_size: int = 64, 
    learning_rate: float = 1e-4, 
    weight_decay: float = 0.0, 
    balance_labels: bool = False,
    pos_weight: float = None,
    max_epochs: int = 100, 
    patience: int = 25, 
    device: str = "cpu"
):
    reset_random_state()

    time_start = timer()

    experimentName = ""

    experimentOptions = [experimentNamePrefix, 'ABP', 'EEG', 'ECG', 'SKIPCONNECTION']
    experimentValues = [experimentNamePrefix is not None, useAbp, useEeg, useEcg, skip_connection]
    experimentFlags = [name for name, value in zip(experimentOptions, experimentValues) if value]
    if experimentFlags:
        experimentName = "_".join(experimentFlags)

    experimentName = f"{experimentName}_{nResiduals}_RESIDUAL_BLOCKS_{batch_size}_BATCH_SIZE_{learning_rate:.0e}_LEARNING_RATE"

    if weight_decay is not None and weight_decay != 0.0:
        experimentName = f"{experimentName}_{weight_decay:.0e}_WEIGHT_DECAY"

    predictionWindow = 'ALL' if PREDICTION_WINDOW == 'ALL' else f'{PREDICTION_WINDOW:03}'
    experimentName = f"{experimentName}_{predictionWindow}_MINS"

    maxCases = '_ALL' if MAX_CASES is None else f'{MAX_CASES:04}'
    experimentName = f"{experimentName}_{maxCases}_MAX_CASES"
    
    # Add a unique 8-character suffix (from uuid4) to the experiment name
    experimentName = f"{experimentName}_{uuid.uuid4().hex[:8]}"

    # default label split based on empirical data
    my_pos_weight = 4.0
    if balance_labels and pos_weight is not None:
        my_pos_weight = pos_weight

    # Fork stdout to file and console
    with ForkedStdout(os.path.join(VITAL_RUNS, f'{experimentName}.log')):
        print(f"Experiment Setup")
        print(f'  name:              {experimentName}')
        print(f'  prediction_window: {predictionWindow}')
        print(f'  max_cases:         {maxCases}')
        print(f'  use_abp:           {useAbp}')
        print(f'  use_eeg:           {useEeg}')
        print(f'  use_ecg:           {useEcg}')
        print(f'  n_residuals:       {nResiduals}')
        print(f'  skip_connection:   {skip_connection}')
        print(f'  batch_size:        {batch_size}')
        print(f'  learning_rate:     {learning_rate}')
        print(f'  weight_decay:      {weight_decay}')
        print(f'  balance_labels:    {balance_labels}')
        if balance_labels:
            print(f'  pos_weight:        {my_pos_weight}')
        print(f'  max_epochs:        {max_epochs}')
        print(f'  patience:          {patience}')
        print(f'  device:            {device}')
        print()

        train_loader = torch.utils.data.DataLoader(train_dataset, batch_size=batch_size, shuffle=True)
        val_loader = torch.utils.data.DataLoader(val_dataset, batch_size=batch_size, shuffle=True)
        test_loader = torch.utils.data.DataLoader(test_dataset, batch_size=batch_size, shuffle=False)

        # Disable final sigmoid activation for BCEWithLogitsLoss
        model = HypotensionCNN(useAbp, useEeg, useEcg, device, nResiduals, skip_connection, useSigmoid=(not balance_labels))
        model = model.to(device)
    
        if balance_labels:
            # Only the weight for the positive class
            loss_func = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([my_pos_weight]).to(device))
        else:
            loss_func = nn.BCELoss()
        optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate, weight_decay=weight_decay)

    
        print(f'Model Architecture')
        print(model)
        print()

        print(f'Training Loop')
        # Training loop
        best_epoch = 0
        train_losses = []
        val_losses = []
        best_loss = float('inf')
        no_improve_epochs = 0
        model_path = os.path.join(VITAL_MODELS, f"{experimentName}.model")

        all_models = []

        for i in range(max_epochs):
            # Train the model and get the training loss
            train_loss = train_model_one_iter(model, device, loss_func, optimizer, train_loader)
            train_losses.append(train_loss)
            # Calculate validate loss
            val_loss = evaluate_model(model, loss_func, val_loader)
            val_losses.append(val_loss)
            print(f"[{datetime.now()}] Completed epoch {i} with training loss {train_loss:.8f}, validation loss {val_loss:.8f}")

            # Save all intermediary models.
            tmp_model_path = os.path.join(VITAL_MODELS, f"{experimentName}_{i:04d}.model")
            torch.save(model.state_dict(), tmp_model_path)
            all_models.append(tmp_model_path)
  
            # Check if validation loss has improved
            if val_loss < best_loss:
                best_epoch = i
                best_loss = val_loss
                no_improve_epochs = 0
                torch.save(model.state_dict(), model_path)
                print(f"Validation loss improved to {val_loss:.8f}. Model saved.")
            else:
                no_improve_epochs += 1
                print(f"No improvement in validation loss. {no_improve_epochs} epochs without improvement.")

            # exit early if no improvement in loss over last 'patience' epochs
            if no_improve_epochs >= patience:
                print("Early stopping due to no improvement in validation loss.")
                break

        # The best checkpoints are reloaded explicitly below before each evaluation.
        # Plot the training and validation losses across all training epochs.
        plot_losses(train_losses, val_losses, best_epoch, experimentName)

        # Generate AUROC/AUPRC for each intermediate model generated across training epochs.
        val_aurocs, val_auprcs, val_accs, test_aurocs, test_auprcs, test_accs = \
            print_all_evals(model, all_models, device, val_loader, test_loader, loss_func, print_detailed=False)

        # Find model with highest AUROC. Plot AUROC/AUPRC across all epochs.
        np_test_aurocs, test_auroc_idx = plot_auroc_auprc(val_losses, val_aurocs, val_auprcs, val_accs, \
                                        test_aurocs, test_auprcs, test_accs, all_models, best_epoch, experimentName)

        ## AUROC / AUPRC - Model with Best Validation Loss
        best_model_val_loss = all_models[best_epoch]
    
        print(f'AUROC/AUPRC Plots - Best Model Based on Validation Loss')
        print(f'  Epoch with best Validation Loss:  {best_epoch:3}, {val_losses[best_epoch]:.4}')
        print(f'  Best Model Based on Validation Loss:')
        print(f'    {best_model_val_loss}')
        print()
        print(f'Generate Stats Based on Test Data')
        model.load_state_dict(torch.load(best_model_val_loss))
        model.eval()
    
        best_model_val_test_predictions, best_model_val_test_labels, test_loss, \
            best_model_val_test_auroc, best_model_val_test_auprc, test_sensitivity, test_specificity, \
            best_model_val_test_threshold, best_model_val_accuracy = \
                eval_model(model, device, test_loader, loss_func, print_detailed=False)

        # y_test, y_pred
        display = RocCurveDisplay.from_predictions(
            best_model_val_test_labels,
            best_model_val_test_predictions,
            plot_chance_level=True
        )
        # Save plot to disk and show
        plt.savefig(os.path.join(VITAL_RUNS, f'{experimentName}_val_auroc.png'))
        plt.show()

        print(f'best_model_val_test_auroc: {best_model_val_test_auroc}')

        # The PR curve is computed from the continuous scores rather than the
        # thresholded labels, so that AUPRC reflects all operating points.
        display = PrecisionRecallDisplay.from_predictions(
            best_model_val_test_labels,
            best_model_val_test_predictions,
            plot_chance_level=True
        )
        # Save plot to disk and show
        plt.savefig(os.path.join(VITAL_RUNS, f'{experimentName}_val_auprc.png'))
        plt.show()

        print(f'best_model_val_test_auprc: {best_model_val_test_auprc}')
        print()

        ## AUROC / AUPRC - Model with Best AUROC
        # Find model with highest AUROC
        best_model_auroc = all_models[test_auroc_idx]

        print(f'AUROC/AUPRC Plots - Best Model Based on Model AUROC')
        print(f'  Epoch with best model Test AUROC: {test_auroc_idx:3}, {np_test_aurocs[test_auroc_idx]:.4}')
        print(f'  Best Model Based on Model AUROC:')
        print(f'    {best_model_auroc}')
        print()
        print(f'Generate Stats Based on Test Data')
        model.load_state_dict(torch.load(best_model_auroc))
        model.eval()
    
        best_model_auroc_test_predictions, best_model_auroc_test_labels, test_loss, \
            best_model_auroc_test_auroc, best_model_auroc_test_auprc, test_sensitivity, test_specificity, \
            best_model_auroc_test_threshold, best_model_auroc_accuracy = \
                eval_model(model, device, test_loader, loss_func, print_detailed=False)

        # y_test, y_pred
        display = RocCurveDisplay.from_predictions(
            best_model_auroc_test_labels,
            best_model_auroc_test_predictions,
            plot_chance_level=True
        )
        # Save plot to disk and show
        plt.savefig(os.path.join(VITAL_RUNS, f'{experimentName}_auroc_auroc.png'))
        plt.show()

        print(f'best_model_auroc_test_auroc: {best_model_auroc_test_auroc}')

        # Use the continuous scores for this PR curve as well.
        display = PrecisionRecallDisplay.from_predictions(
            best_model_auroc_test_labels,
            best_model_auroc_test_predictions,
            plot_chance_level=True
        )
        # Save plot to disk and show
        plt.savefig(os.path.join(VITAL_RUNS, f'{experimentName}_auroc_auprc.png'))
        plt.show()

        print(f"best_model_auroc_test_auprc: {best_model_auroc_test_auprc}")
        print()
        
        time_delta = np.round(timer() - time_start, 3)
        print(f'Total Processing Time: {time_delta:.3f} sec')
        
    return (model, best_model_val_loss, best_model_auroc, experimentName)
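The loss setup at the top of this cell passes `my_pos_weight` to `BCEWithLogitsLoss` to up-weight the minority (hypotensive) class. A common heuristic for choosing that value (an assumption for illustration here, not necessarily the rule this notebook uses) is the ratio of negative to positive training segments; a minimal sketch:

```python
def compute_pos_weight(labels):
    """Heuristic pos_weight for BCEWithLogitsLoss: n_negative / n_positive.

    A value > 1 up-weights the positive class in the loss; 1.0 leaves it unchanged.
    """
    n_pos = sum(1 for y in labels if y == 1)
    n_neg = len(labels) - n_pos
    if n_pos == 0:
        raise ValueError("no positive labels; pos_weight is undefined")
    return n_neg / n_pos

# Example: 12 negative and 4 positive segments -> pos_weight of 3.0,
# which would be passed as torch.tensor([3.0]) to the loss above.
print(compute_pos_weight([0] * 12 + [1] * 4))  # -> 3.0
```

In practice the ratio would be computed over the actual training split, since segment counts vary per case (see the per-case positive/negative segment counts printed below).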

SPLITS¶

In [101]:
print('Time to experiment!')
Time to experiment!
In [102]:
PERFORM_TRACK_VALIDITY_CHECKS=True
In [103]:
# Default to the same case set used in the segment validation checks above.
# Uncomment the alternative below to run on a smaller subset of cases.
my_cases_of_interest_idx = my_cases_of_interest_idx
#my_cases_of_interest_idx = cases_of_interest_idx[:10]

# Reuse segment data from an earlier run if it exists; otherwise extract it.
if PERFORM_TRACK_VALIDITY_CHECKS \
    and positiveSegmentsMap is None \
    and negativeSegmentsMap is None \
    and iohEventsMap is None \
    and cleanEventsMap is None:
    positiveSegmentsMap, negativeSegmentsMap, iohEventsMap, cleanEventsMap = \
        extract_segments(my_cases_of_interest_idx, debug=False,
                         checkCache=False, forceWrite=False, returnSegments=True,
                         skipInvalidCleanEvents=SKIP_INVALID_CLEAN_EVENTS,
                         skipInvalidIohEvents=SKIP_INVALID_IOH_EVENTS)
84: positiveSegments: 4, negativeSegments: 15
198: positiveSegments: 4, negativeSegments: 12
60: positiveSegments: 4, negativeSegments: 3
16: positiveSegments: 8, negativeSegments: 6
27: positiveSegments: 8, negativeSegments: 12
In [104]:
MULTI_RUN = True
MAX_EPOCHS = 200
PATIENCE = 20
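`eval_model` in `run_experiment` returns an operating threshold that is used to binarize the continuous predictions. One standard way to pick such a threshold (an assumption for illustration; `eval_model` may use a different rule) is Youden's J statistic, the point on the ROC curve that maximizes sensitivity + specificity − 1:

```python
def youden_threshold(y_true, y_score):
    """Pick the score threshold maximizing Youden's J = TPR - FPR."""
    n_pos = sum(y_true)
    n_neg = len(y_true) - n_pos
    best_t, best_j = 0.5, -1.0
    for t in sorted(set(y_score)):
        tp = sum(1 for y, s in zip(y_true, y_score) if y == 1 and s >= t)
        fp = sum(1 for y, s in zip(y_true, y_score) if y == 0 and s >= t)
        j = tp / n_pos - fp / n_neg
        if j > best_j:
            best_j, best_t = j, t
    return best_t

# Perfectly separable toy scores: the best cut sits at the lowest positive score.
print(youden_threshold([0, 0, 1, 1], [0.1, 0.4, 0.6, 0.9]))  # -> 0.6
```

This brute-force scan is O(n²) and meant only to make the idea concrete; `sklearn.metrics.roc_curve` provides the candidate thresholds directly for an efficient version.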

ABP, EEG, and ECG Splits¶

In [105]:
RUN_ME = True
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=1e-1,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )

    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
Experiment Setup
  name:              ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4
  prediction_window: 003
  max_cases:         _ALL
  use_abp:           True
  use_eeg:           False
  use_ecg:           False
  n_residuals:       12
  skip_connection:   False
  batch_size:        128
  learning_rate:     0.0001
  weight_decay:      0.1
  balance_labels:    False
  max_epochs:        200
  patience:          20
  device:            mps

Model Architecture
HypotensionCNN(
  (abpResiduals): Sequential(
    (0): ResidualBlock(
      (bn1): BatchNorm1d(1, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (1): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (2): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (3): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (4): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (5): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (6): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (7): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (8): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=1, dilation=1, ceil_mode=False)
    )
    (9): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (10): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (11): ResidualBlock(
      (bn1): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
  )
  (abpFc): Linear(in_features=2814, out_features=32, bias=True)
  (fullLinear1): Linear(in_features=32, out_features=16, bias=True)
  (fullLinear2): Linear(in_features=16, out_features=1, bias=True)
  (sigmoid): Sigmoid()
)

Training Loop
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:46<00:00,  1.97it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.54it/s]
[2024-05-05 03:52:19.545744] Completed epoch 0 with training loss 0.50833422, validation loss 0.59529662
Validation loss improved to 0.59529662. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.65it/s]
[2024-05-05 03:53:09.774975] Completed epoch 1 with training loss 0.44851246, validation loss 0.62959111
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.08it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 03:54:00.021624] Completed epoch 2 with training loss 0.43924439, validation loss 0.65783167
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.66it/s]
[2024-05-05 03:54:50.141204] Completed epoch 3 with training loss 0.44168407, validation loss 0.55741930
Validation loss improved to 0.55741930. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
[2024-05-05 03:55:40.332541] Completed epoch 4 with training loss 0.43743125, validation loss 0.58566737
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.08it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 03:56:30.571501] Completed epoch 5 with training loss 0.43733540, validation loss 0.55916595
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.07it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.65it/s]
[2024-05-05 03:57:21.054302] Completed epoch 6 with training loss 0.43729058, validation loss 0.56147313
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.65it/s]
[2024-05-05 03:58:11.090365] Completed epoch 7 with training loss 0.43614846, validation loss 0.55170906
Validation loss improved to 0.55170906. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 03:59:01.186853] Completed epoch 8 with training loss 0.43605489, validation loss 0.54812640
Validation loss improved to 0.54812640. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.08it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 03:59:51.506156] Completed epoch 9 with training loss 0.43672666, validation loss 0.55010986
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:00:41.647663] Completed epoch 10 with training loss 0.43778178, validation loss 0.55915105
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.07it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.62it/s]
[2024-05-05 04:01:32.128796] Completed epoch 11 with training loss 0.43698445, validation loss 0.55136228
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.08it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.65it/s]
[2024-05-05 04:02:22.362850] Completed epoch 12 with training loss 0.43375283, validation loss 0.57088220
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.65it/s]
[2024-05-05 04:03:12.456629] Completed epoch 13 with training loss 0.43675232, validation loss 0.58583933
No improvement in validation loss. 5 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:04:02.685938] Completed epoch 14 with training loss 0.43537372, validation loss 0.57108909
No improvement in validation loss. 6 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:04:52.902474] Completed epoch 15 with training loss 0.43363327, validation loss 0.53953600
Validation loss improved to 0.53953600. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
[2024-05-05 04:05:43.128735] Completed epoch 16 with training loss 0.43471885, validation loss 0.56788933
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
[2024-05-05 04:06:33.277746] Completed epoch 17 with training loss 0.43666270, validation loss 0.57144058
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.65it/s]
[2024-05-05 04:07:23.349901] Completed epoch 18 with training loss 0.43400231, validation loss 0.52189499
Validation loss improved to 0.52189499. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.08it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:08:13.669210] Completed epoch 19 with training loss 0.43291110, validation loss 0.54700166
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:09:03.821702] Completed epoch 20 with training loss 0.43403870, validation loss 0.53983629
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:09:53.993382] Completed epoch 21 with training loss 0.43454343, validation loss 0.54865545
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.65it/s]
[2024-05-05 04:10:44.139950] Completed epoch 22 with training loss 0.43107948, validation loss 0.52294970
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:11:34.293047] Completed epoch 23 with training loss 0.43061355, validation loss 0.53748000
No improvement in validation loss. 5 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.05it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 04:12:25.608847] Completed epoch 24 with training loss 0.42893124, validation loss 0.59605163
No improvement in validation loss. 6 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:45<00:00,  2.04it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.60it/s]
[2024-05-05 04:13:16.840996] Completed epoch 25 with training loss 0.43077737, validation loss 0.51150513
Validation loss improved to 0.51150513. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.06it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.59it/s]
[2024-05-05 04:14:07.763979] Completed epoch 26 with training loss 0.43234801, validation loss 0.52688259
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:44<00:00,  2.07it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:14:58.388540] Completed epoch 27 with training loss 0.42823517, validation loss 0.52790546
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.10it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.62it/s]
[2024-05-05 04:15:48.388750] Completed epoch 28 with training loss 0.43162555, validation loss 0.53847039
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.11it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.66it/s]
[2024-05-05 04:16:38.005066] Completed epoch 29 with training loss 0.43114683, validation loss 0.51702124
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.11it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.66it/s]
[2024-05-05 04:17:27.623339] Completed epoch 30 with training loss 0.42790487, validation loss 0.50053477
Validation loss improved to 0.50053477. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.09it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:18:17.682877] Completed epoch 31 with training loss 0.42715064, validation loss 0.51605386
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.10it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.65it/s]
[2024-05-05 04:19:07.573568] Completed epoch 32 with training loss 0.42941990, validation loss 0.49642193
Validation loss improved to 0.49642193. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.10it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
[2024-05-05 04:19:57.456702] Completed epoch 33 with training loss 0.42918620, validation loss 0.54775763
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:43<00:00,  2.10it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.66it/s]
[2024-05-05 04:20:47.223091] Completed epoch 34 with training loss 0.42818713, validation loss 0.48913077
Validation loss improved to 0.48913077. Model saved.
[2024-05-05 04:21:36.902944] Completed epoch 35 with training loss 0.42712817, validation loss 0.53503311
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 04:22:26.720432] Completed epoch 36 with training loss 0.42801517, validation loss 0.53061211
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 04:23:16.322701] Completed epoch 37 with training loss 0.42817730, validation loss 0.52683055
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 04:24:06.239082] Completed epoch 38 with training loss 0.42782038, validation loss 0.55299819
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 04:24:56.203175] Completed epoch 39 with training loss 0.42918590, validation loss 0.44656762
Validation loss improved to 0.44656762. Model saved.
[2024-05-05 04:25:46.311207] Completed epoch 40 with training loss 0.42814475, validation loss 0.47097564
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 04:26:36.279726] Completed epoch 41 with training loss 0.43031609, validation loss 0.52885616
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 04:27:26.312889] Completed epoch 42 with training loss 0.42799208, validation loss 0.47199538
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 04:28:16.382977] Completed epoch 43 with training loss 0.42945024, validation loss 0.52586907
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 04:29:06.382045] Completed epoch 44 with training loss 0.42882037, validation loss 0.49342996
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 04:29:56.450936] Completed epoch 45 with training loss 0.42804122, validation loss 0.43838733
Validation loss improved to 0.43838733. Model saved.
[2024-05-05 04:30:46.526040] Completed epoch 46 with training loss 0.43102881, validation loss 0.49334717
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 04:31:36.543630] Completed epoch 47 with training loss 0.43131077, validation loss 0.44217652
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 04:32:26.742935] Completed epoch 48 with training loss 0.42455044, validation loss 0.52443719
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 04:33:16.793947] Completed epoch 49 with training loss 0.42907739, validation loss 0.48798549
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 04:34:06.992829] Completed epoch 50 with training loss 0.42762011, validation loss 0.42887950
Validation loss improved to 0.42887950. Model saved.
[2024-05-05 04:34:57.031953] Completed epoch 51 with training loss 0.42845961, validation loss 0.47757357
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 04:35:47.256087] Completed epoch 52 with training loss 0.42326748, validation loss 0.45507789
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 04:36:37.311929] Completed epoch 53 with training loss 0.43148181, validation loss 0.44295955
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 04:37:27.489711] Completed epoch 54 with training loss 0.43180287, validation loss 0.44134885
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 04:38:17.816844] Completed epoch 55 with training loss 0.42932859, validation loss 0.51395851
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 04:39:08.234737] Completed epoch 56 with training loss 0.42646354, validation loss 0.50408709
No improvement in validation loss. 6 epochs without improvement.
[2024-05-05 04:39:58.717660] Completed epoch 57 with training loss 0.42704985, validation loss 0.42835426
Validation loss improved to 0.42835426. Model saved.
[2024-05-05 04:40:49.209214] Completed epoch 58 with training loss 0.42734888, validation loss 0.42591697
Validation loss improved to 0.42591697. Model saved.
[2024-05-05 04:41:39.677120] Completed epoch 59 with training loss 0.42580131, validation loss 0.45672274
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 04:42:30.110149] Completed epoch 60 with training loss 0.42457071, validation loss 0.43872994
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 04:43:20.521599] Completed epoch 61 with training loss 0.42612121, validation loss 0.49997139
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 04:44:10.909751] Completed epoch 62 with training loss 0.42727515, validation loss 0.48263228
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 04:45:01.226157] Completed epoch 63 with training loss 0.42592803, validation loss 0.44971976
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 04:45:51.610174] Completed epoch 64 with training loss 0.42711225, validation loss 0.52797014
No improvement in validation loss. 6 epochs without improvement.
[2024-05-05 04:46:41.978635] Completed epoch 65 with training loss 0.42680052, validation loss 0.46766883
No improvement in validation loss. 7 epochs without improvement.
[2024-05-05 04:47:32.291262] Completed epoch 66 with training loss 0.42457324, validation loss 0.50480855
No improvement in validation loss. 8 epochs without improvement.
[2024-05-05 04:48:22.693787] Completed epoch 67 with training loss 0.42601913, validation loss 0.46518394
No improvement in validation loss. 9 epochs without improvement.
[2024-05-05 04:49:13.077225] Completed epoch 68 with training loss 0.42326871, validation loss 0.45578936
No improvement in validation loss. 10 epochs without improvement.
[2024-05-05 04:50:03.472593] Completed epoch 69 with training loss 0.42562884, validation loss 0.47505328
No improvement in validation loss. 11 epochs without improvement.
[2024-05-05 04:50:54.010928] Completed epoch 70 with training loss 0.42315254, validation loss 0.49330550
No improvement in validation loss. 12 epochs without improvement.
[2024-05-05 04:51:44.446959] Completed epoch 71 with training loss 0.42616937, validation loss 0.46326771
No improvement in validation loss. 13 epochs without improvement.
[2024-05-05 04:52:34.802732] Completed epoch 72 with training loss 0.43005762, validation loss 0.48057401
No improvement in validation loss. 14 epochs without improvement.
[2024-05-05 04:53:25.196331] Completed epoch 73 with training loss 0.42651272, validation loss 0.50960070
No improvement in validation loss. 15 epochs without improvement.
[2024-05-05 04:54:15.606277] Completed epoch 74 with training loss 0.42528972, validation loss 0.43683875
No improvement in validation loss. 16 epochs without improvement.
[2024-05-05 04:55:05.990092] Completed epoch 75 with training loss 0.42661434, validation loss 0.53190541
No improvement in validation loss. 17 epochs without improvement.
[2024-05-05 04:55:56.295999] Completed epoch 76 with training loss 0.42595631, validation loss 0.45761618
No improvement in validation loss. 18 epochs without improvement.
[2024-05-05 04:56:46.415769] Completed epoch 77 with training loss 0.42501685, validation loss 0.48454574
No improvement in validation loss. 19 epochs without improvement.
[2024-05-05 04:57:36.471131] Completed epoch 78 with training loss 0.42659798, validation loss 0.42977071
No improvement in validation loss. 20 epochs without improvement.
Early stopping due to no improvement in validation loss.
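
The log above reflects a standard best-checkpoint loop with patience-based early stopping: training halts after 20 consecutive epochs without a new best validation loss, and the model is saved only on improvement. The project's actual loop is defined earlier in the notebook; the sketch below is a generic reconstruction of the pattern implied by these log messages, with hypothetical `train_one_epoch`, `validate`, and `save_model` callables standing in for the real training code.

```python
import math

def fit(model_state, train_one_epoch, validate, save_model,
        max_epochs=100, patience=20):
    """Checkpoint on best validation loss; stop after `patience` bad epochs."""
    best_val = math.inf
    epochs_without_improvement = 0
    for epoch in range(max_epochs):
        train_loss = train_one_epoch(model_state)
        val_loss = validate(model_state)
        print(f"Completed epoch {epoch} with training loss {train_loss:.8f}, "
              f"validation loss {val_loss:.8f}")
        if val_loss < best_val:
            best_val = val_loss
            epochs_without_improvement = 0
            save_model(model_state)  # only the best weights are persisted
            print(f"Validation loss improved to {best_val:.8f}. Model saved.")
        else:
            epochs_without_improvement += 1
            print(f"No improvement in validation loss. "
                  f"{epochs_without_improvement} epochs without improvement.")
            if epochs_without_improvement >= patience:
                print("Early stopping due to no improvement in validation loss.")
                break
    return best_val
```

Because the last epoch is rarely the best one (here epoch 58 of 78), restoring the checkpointed weights rather than the final-epoch weights is what makes the long patience window safe.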

Plot Training and Validation Loss Values
  Epoch with best Validation Loss:   58, 0.4259
Generate AUROC/AUPRC for Each Intermediate Model

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0000.model
AUROC/AUPRC on Validation Data
Loss: 0.5943517349660397
AUROC: 0.8386042453909374
AUPRC: 0.7068078088707609
Sensitivity: 0.7789661319073083
Specificity: 0.7496473906911142
Threshold: 0.13
Accuracy:  0.7579585649317837

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.55it/s]
Loss: 0.5523598677582211
AUROC: 0.834096312248085
AUPRC: 0.6646452592321672
Sensitivity: 0.7444519166106254
Specificity: 0.7875029811590747
Threshold: 0.14
Accuracy:  0.7762323943661972
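
Each intermediate model is summarized by AUROC, AUPRC, and a sensitivity/specificity/accuracy triple at a reported probability threshold. The notebook's own metric code is not visible in this output; the sketch below is a NumPy reconstruction under an explicit assumption: the threshold is swept over a 0.01 grid to maximize Youden's J (sensitivity + specificity - 1), one common rule that is consistent with the two-decimal thresholds printed above. `binary_metrics` is a hypothetical helper, not the project's function.

```python
import numpy as np

def binary_metrics(y_true, y_score, thresholds=np.arange(0.01, 1.0, 0.01)):
    """AUROC via the Mann-Whitney statistic, AUPRC via average precision,
    and the grid threshold maximizing Youden's J (assumed selection rule)."""
    y_true = np.asarray(y_true, dtype=bool)
    y_score = np.asarray(y_score, dtype=float)
    pos, neg = y_score[y_true], y_score[~y_true]
    # AUROC = P(score_pos > score_neg), with ties counted as 1/2
    auroc = ((pos[:, None] > neg[None, :]).mean()
             + 0.5 * (pos[:, None] == neg[None, :]).mean())
    # AUPRC as average precision over the descending-score ranking
    order = np.argsort(-y_score)
    hits = y_true[order]
    precision = np.cumsum(hits) / np.arange(1, len(hits) + 1)
    auprc = precision[hits].sum() / y_true.sum()
    best = None
    for t in thresholds:
        pred = y_score >= t
        sens = (pred & y_true).sum() / y_true.sum()
        spec = (~pred & ~y_true).sum() / (~y_true).sum()
        acc = (pred == y_true).mean()
        j = sens + spec - 1.0
        if best is None or j > best["J"]:
            best = {"threshold": round(float(t), 2), "sensitivity": sens,
                    "specificity": spec, "accuracy": acc, "J": j}
    return {"AUROC": float(auroc), "AUPRC": float(auprc), **best}
```

With perfectly separated scores all five numbers reach 1.0; on the validation and test sets above they land in the 0.67 to 0.85 range reported per model.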

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0001.model
AUROC/AUPRC on Validation Data
Loss: 0.6270489171147346
AUROC: 0.8425828851863864
AUPRC: 0.7232692997330766
Sensitivity: 0.7825311942959001
Specificity: 0.7461212976022567
Threshold: 0.1
Accuracy:  0.7564426478019202

AUROC/AUPRC on Test Data
Loss: 0.5783150699403551
AUROC: 0.8389189334836249
AUPRC: 0.6770373959107021
Sensitivity: 0.7491593813046402
Specificity: 0.7894109229668496
Threshold: 0.11
Accuracy:  0.7788732394366197

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0002.model
AUROC/AUPRC on Validation Data
Loss: 0.6589974351227283
AUROC: 0.8436337992050262
AUPRC: 0.7281657181791784
Sensitivity: 0.750445632798574
Specificity: 0.7799717912552891
Threshold: 0.09
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
Loss: 0.6023767676618365
AUROC: 0.8391154052989009
AUPRC: 0.6803072157853195
Sensitivity: 0.7700067249495629
Specificity: 0.7662771285475793
Threshold: 0.09
Accuracy:  0.7672535211267606

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0003.model
AUROC/AUPRC on Validation Data
Loss: 0.5622062962502241
AUROC: 0.8435797450150725
AUPRC: 0.7260173688157429
Sensitivity: 0.7540106951871658
Specificity: 0.770098730606488
Threshold: 0.15
Accuracy:  0.7655381505811015

AUROC/AUPRC on Test Data
Loss: 0.5225187043348948
AUROC: 0.8403070670029836
AUPRC: 0.6809590121113888
Sensitivity: 0.773369199731002
Specificity: 0.7579298831385642
Threshold: 0.15
Accuracy:  0.7619718309859155

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0004.model
AUROC/AUPRC on Validation Data
Loss: 0.5888707414269447
AUROC: 0.8434175824452104
AUPRC: 0.726826344602989
Sensitivity: 0.7700534759358288
Specificity: 0.7517630465444288
Threshold: 0.12
Accuracy:  0.7569479535118747

AUROC/AUPRC on Test Data
Loss: 0.5408001144727071
AUROC: 0.8396969618721182
AUPRC: 0.6810882121210099
Sensitivity: 0.753866845998655
Specificity: 0.7805866921058908
Threshold: 0.13
Accuracy:  0.7735915492957747

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0005.model
AUROC/AUPRC on Validation Data
Loss: 0.5520273726433516
AUROC: 0.8436903675433501
AUPRC: 0.7267166898181625
Sensitivity: 0.7468805704099821
Specificity: 0.7834978843441467
Threshold: 0.16
Accuracy:  0.7731177362304194

AUROC/AUPRC on Test Data
Loss: 0.5134222388267518
AUROC: 0.8398379404236509
AUPRC: 0.6808079770310034
Sensitivity: 0.7679892400806994
Specificity: 0.7643691867398045
Threshold: 0.16
Accuracy:  0.7653169014084507

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0006.model
AUROC/AUPRC on Validation Data
Loss: 0.5611043497920036
AUROC: 0.8431397690503307
AUPRC: 0.7265807889362774
Sensitivity: 0.750445632798574
Specificity: 0.7778561354019746
Threshold: 0.15
Accuracy:  0.7700859019706923

AUROC/AUPRC on Test Data
Loss: 0.5175240102741453
AUROC: 0.8391105937442411
AUPRC: 0.68039661555115
Sensitivity: 0.7726967047747142
Specificity: 0.7591223467684236
Threshold: 0.15
Accuracy:  0.7626760563380282

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0007.model
AUROC/AUPRC on Validation Data
Loss: 0.5472812242805958
AUROC: 0.8429952055190585
AUPRC: 0.7261938423170143
Sensitivity: 0.7736185383244206
Specificity: 0.7482369534555712
Threshold: 0.15
Accuracy:  0.7554320363820111

AUROC/AUPRC on Test Data
Loss: 0.5127201934655508
AUROC: 0.8385319241038198
AUPRC: 0.6798541627466503
Sensitivity: 0.7652992602555481
Specificity: 0.7703315048891008
Threshold: 0.16
Accuracy:  0.7690140845070422

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0008.model
AUROC/AUPRC on Validation Data
Loss: 0.5508644916117191
AUROC: 0.8428041302429421
AUPRC: 0.7264867848531942
Sensitivity: 0.7557932263814616
Specificity: 0.7764456981664316
Threshold: 0.15
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
Loss: 0.5141454226440854
AUROC: 0.8381702555785565
AUPRC: 0.6793443080117737
Sensitivity: 0.7740416946872899
Specificity: 0.7553064631528739
Threshold: 0.15
Accuracy:  0.7602112676056338

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0009.model
AUROC/AUPRC on Validation Data
Loss: 0.5466129556298256
AUROC: 0.842601741299161
AUPRC: 0.7261017428358801
Sensitivity: 0.7664884135472371
Specificity: 0.7595204513399154
Threshold: 0.15
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
Loss: 0.5090540690554513
AUROC: 0.8375915859381352
AUPRC: 0.6789376322309876
Sensitivity: 0.753866845998655
Specificity: 0.7767708084903411
Threshold: 0.16
Accuracy:  0.770774647887324

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0010.model
AUROC/AUPRC on Validation Data
Loss: 0.5606414116919041
AUROC: 0.8418889802362797
AUPRC: 0.7264391247905663
Sensitivity: 0.768270944741533
Specificity: 0.7566995768688294
Threshold: 0.14
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
Loss: 0.5213198800881703
AUROC: 0.836534086416484
AUPRC: 0.6784095169270021
Sensitivity: 0.753866845998655
Specificity: 0.7755783448604817
Threshold: 0.15
Accuracy:  0.7698943661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0011.model
AUROC/AUPRC on Validation Data
Loss: 0.5541900433599949
AUROC: 0.8416891054408685
AUPRC: 0.7253998246008071
Sensitivity: 0.768270944741533
Specificity: 0.7574047954866009
Threshold: 0.15
Accuracy:  0.7604850934815564

AUROC/AUPRC on Test Data
Loss: 0.5117154866456985
AUROC: 0.8362421052412106
AUPRC: 0.6780387692472195
Sensitivity: 0.7552118359112306
Specificity: 0.7736704030527068
Threshold: 0.16
Accuracy:  0.768838028169014

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0012.model
AUROC/AUPRC on Validation Data
Loss: 0.5712853763252497
AUROC: 0.8410769103127852
AUPRC: 0.7254298762979967
Sensitivity: 0.768270944741533
Specificity: 0.7588152327221439
Threshold: 0.13
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
Loss: 0.5283849812216229
AUROC: 0.8353547743693616
AUPRC: 0.6775398786365191
Sensitivity: 0.7505043712172159
Specificity: 0.7743858812306225
Threshold: 0.14
Accuracy:  0.7681338028169014

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0013.model
AUROC/AUPRC on Validation Data
Loss: 0.5903265997767448
AUROC: 0.8403339794694644
AUPRC: 0.7255752356111844
Sensitivity: 0.7450980392156863
Specificity: 0.7884344146685472
Threshold: 0.12
Accuracy:  0.7761495704901465

AUROC/AUPRC on Test Data
Loss: 0.5450693256325192
AUROC: 0.8345623915094664
AUPRC: 0.6765658624348907
Sensitivity: 0.7659717552118359
Specificity: 0.755067970426902
Threshold: 0.12
Accuracy:  0.7579225352112676

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0014.model
AUROC/AUPRC on Validation Data
Loss: 0.5708944778889418
AUROC: 0.8400888500033942
AUPRC: 0.7255404330139339
Sensitivity: 0.7540106951871658
Specificity: 0.7820874471086037
Threshold: 0.13
Accuracy:  0.7741283476503285

AUROC/AUPRC on Test Data
Loss: 0.530538723203871
AUROC: 0.8342605466471403
AUPRC: 0.6759316322919244
Sensitivity: 0.7720242098184263
Specificity: 0.7491056522776055
Threshold: 0.13
Accuracy:  0.7551056338028169

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0015.model
AUROC/AUPRC on Validation Data
Loss: 0.5392202064394951
AUROC: 0.8404710005556268
AUPRC: 0.7258800721400049
Sensitivity: 0.7575757575757576
Specificity: 0.7778561354019746
Threshold: 0.16
Accuracy:  0.7721071248105104

AUROC/AUPRC on Test Data
Loss: 0.5043413165542815
AUROC: 0.8340850852872121
AUPRC: 0.6759820008660793
Sensitivity: 0.7424344317417619
Specificity: 0.7779632721202003
Threshold: 0.17
Accuracy:  0.768661971830986

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0016.model
AUROC/AUPRC on Validation Data
Loss: 0.5706335082650185
AUROC: 0.8392503815220151
AUPRC: 0.7247886280888697
Sensitivity: 0.7522281639928698
Specificity: 0.7799717912552891
Threshold: 0.13
Accuracy:  0.7721071248105104

AUROC/AUPRC on Test Data
Loss: 0.5264366209506989
AUROC: 0.8337476349204034
AUPRC: 0.6744988624423437
Sensitivity: 0.7760591795561533
Specificity: 0.7469592177438588
Threshold: 0.13
Accuracy:  0.7545774647887324

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0017.model
AUROC/AUPRC on Validation Data
Loss: 0.5764571577310562
AUROC: 0.8395797349584788
AUPRC: 0.7260287443235183
Sensitivity: 0.7522281639928698
Specificity: 0.7771509167842031
Threshold: 0.12
Accuracy:  0.7700859019706923

AUROC/AUPRC on Test Data
Loss: 0.5316736449797949
AUROC: 0.8334693666759102
AUPRC: 0.6746383616559445
Sensitivity: 0.7424344317417619
Specificity: 0.7722394466968757
Threshold: 0.13
Accuracy:  0.7644366197183099

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0018.model
AUROC/AUPRC on Validation Data
Loss: 0.5282786004245281
AUROC: 0.8402459842765161
AUPRC: 0.7251759858093905
Sensitivity: 0.768270944741533
Specificity: 0.7524682651622003
Threshold: 0.16
Accuracy:  0.7569479535118747

AUROC/AUPRC on Test Data
Loss: 0.490554173456298
AUROC: 0.8350847659603677
AUPRC: 0.6752943127973161
Sensitivity: 0.7673167451244116
Specificity: 0.7495826377295493
Threshold: 0.17
Accuracy:  0.754225352112676

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0019.model
AUROC/AUPRC on Validation Data
Loss: 0.5472211912274361
AUROC: 0.8392151834448358
AUPRC: 0.7264099125741273
Sensitivity: 0.7700534759358288
Specificity: 0.7538787023977433
Threshold: 0.16
Accuracy:  0.7584638706417383

AUROC/AUPRC on Test Data
Loss: 0.5063754770490858
AUROC: 0.8334496393018049
AUPRC: 0.6734291269491501
Sensitivity: 0.7579018157363819
Specificity: 0.7536370140710709
Threshold: 0.17
Accuracy:  0.7547535211267605

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0020.model
AUROC/AUPRC on Validation Data
Loss: 0.5420079845935106
AUROC: 0.8410266273453862
AUPRC: 0.7260623269849847
Sensitivity: 0.7736185383244206
Specificity: 0.7581100141043724
Threshold: 0.15
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
Loss: 0.5038113607300653
AUROC: 0.837245635158094
AUPRC: 0.6753596404432199
Sensitivity: 0.7679892400806994
Specificity: 0.7545909849749582
Threshold: 0.16
Accuracy:  0.7580985915492958

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0021.model
AUROC/AUPRC on Validation Data
Loss: 0.5423388537019491
AUROC: 0.8386256156520822
AUPRC: 0.7237014755898828
Sensitivity: 0.7700534759358288
Specificity: 0.7559943582510579
Threshold: 0.15
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
Loss: 0.5030771533648173
AUROC: 0.8336011230810116
AUPRC: 0.6723948845981513
Sensitivity: 0.7605917955615333
Specificity: 0.753398521345099
Threshold: 0.16
Accuracy:  0.7552816901408451

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0022.model
AUROC/AUPRC on Validation Data
Loss: 0.5252375956624746
AUROC: 0.8419241783134589
AUPRC: 0.7291837339509644
Sensitivity: 0.7700534759358288
Specificity: 0.7496473906911142
Threshold: 0.17
Accuracy:  0.7554320363820111

AUROC/AUPRC on Test Data
Loss: 0.48768208821614584
AUROC: 0.836985490436153
AUPRC: 0.6742906780546973
Sensitivity: 0.7673167451244116
Specificity: 0.7474362031958025
Threshold: 0.18
Accuracy:  0.7526408450704225

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0023.model
AUROC/AUPRC on Validation Data
Loss: 0.5414294917136431
AUROC: 0.8386683561743713
AUPRC: 0.7256368120729092
Sensitivity: 0.7736185383244206
Specificity: 0.7489421720733427
Threshold: 0.16
Accuracy:  0.7559373420919656

AUROC/AUPRC on Test Data
Loss: 0.5000352064768473
AUROC: 0.8334074580059537
AUPRC: 0.6703030025794791
Sensitivity: 0.7659717552118359
Specificity: 0.7455282613880276
Threshold: 0.17
Accuracy:  0.7508802816901409

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0024.model
AUROC/AUPRC on Validation Data
Loss: 0.6086224745959044
AUROC: 0.8365929266949759
AUPRC: 0.7215333170824291
Sensitivity: 0.7629233511586453
Specificity: 0.7609308885754584
Threshold: 0.1
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
Loss: 0.5535019195742077
AUROC: 0.8311350409326974
AUPRC: 0.6689329318250058
Sensitivity: 0.7531943510423672
Specificity: 0.758406868590508
Threshold: 0.11
Accuracy:  0.7570422535211268

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0025.model
AUROC/AUPRC on Validation Data
Loss: 0.5077817067503929
AUROC: 0.8388632026730425
AUPRC: 0.7283077297353142
Sensitivity: 0.7718360071301248
Specificity: 0.7418899858956276
Threshold: 0.2
Accuracy:  0.7503789792824659

AUROC/AUPRC on Test Data
Loss: 0.47750452558199563
AUROC: 0.832886366636295
AUPRC: 0.6710144828033199
Sensitivity: 0.7565568258238063
Specificity: 0.7476746959217744
Threshold: 0.21
Accuracy:  0.75

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0026.model
AUROC/AUPRC on Validation Data
Loss: 0.5235172882676125
AUROC: 0.8395960769228836
AUPRC: 0.7244477742537978
Sensitivity: 0.7664884135472371
Specificity: 0.7552891396332864
Threshold: 0.16
Accuracy:  0.7584638706417383

AUROC/AUPRC on Test Data
Loss: 0.4924885027938419
AUROC: 0.8355798749348635
AUPRC: 0.6713714782295894
Sensitivity: 0.7592468056489576
Specificity: 0.7548294777009301
Threshold: 0.17
Accuracy:  0.7559859154929578

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0027.model
AUROC/AUPRC on Validation Data
Loss: 0.5239715054631233
AUROC: 0.8419078363490544
AUPRC: 0.7309529091184153
Sensitivity: 0.750445632798574
Specificity: 0.771509167842031
Threshold: 0.17
Accuracy:  0.7655381505811015

AUROC/AUPRC on Test Data
Loss: 0.488392522599962
AUROC: 0.8371210158924047
AUPRC: 0.6714974733652982
Sensitivity: 0.7484868863483524
Specificity: 0.7674695921774386
Threshold: 0.18
Accuracy:  0.7625

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0028.model
AUROC/AUPRC on Validation Data
Loss: 0.536174762994051
AUROC: 0.842916009845405
AUPRC: 0.7281411821338681
Sensitivity: 0.7647058823529411
Specificity: 0.7637517630465445
Threshold: 0.16
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
Loss: 0.4979488319820828
AUROC: 0.8386531752812474
AUPRC: 0.6722908195781618
Sensitivity: 0.7545393409549428
Specificity: 0.7603148103982829
Threshold: 0.17
Accuracy:  0.7588028169014085

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0029.model
AUROC/AUPRC on Validation Data
Loss: 0.5135802887380123
AUROC: 0.8437959617748882
AUPRC: 0.7307481713941606
Sensitivity: 0.768270944741533
Specificity: 0.7559943582510579
Threshold: 0.19
Accuracy:  0.7594744820616472

AUROC/AUPRC on Test Data
Loss: 0.4762341936429342
AUROC: 0.8394761115132324
AUPRC: 0.6739902400462374
Sensitivity: 0.7666442501681238
Specificity: 0.7536370140710709
Threshold: 0.2
Accuracy:  0.7570422535211268

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0030.model
AUROC/AUPRC on Validation Data
Loss: 0.5030414089560509
AUROC: 0.8403553497306091
AUPRC: 0.7274533883187395
Sensitivity: 0.750445632798574
Specificity: 0.7736248236953456
Threshold: 0.19
Accuracy:  0.7670540677109652

AUROC/AUPRC on Test Data
Loss: 0.46695818901062014
AUROC: 0.8364427470705251
AUPRC: 0.6740392115900021
Sensitivity: 0.7451244115669132
Specificity: 0.7646076794657763
Threshold: 0.2
Accuracy:  0.7595070422535212

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0031.model
AUROC/AUPRC on Validation Data
Loss: 0.5143595933914185
AUROC: 0.8447060834848108
AUPRC: 0.7326331612187782
Sensitivity: 0.7486631016042781
Specificity: 0.7708039492242595
Threshold: 0.18
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
Loss: 0.48152931167019736
AUROC: 0.8401260723551968
AUPRC: 0.6739781319811924
Sensitivity: 0.7726967047747142
Specificity: 0.7462437395659433
Threshold: 0.18
Accuracy:  0.7531690140845071

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0032.model
AUROC/AUPRC on Validation Data
Loss: 0.5006893947720528
AUROC: 0.8452491395327203
AUPRC: 0.7319138966247943
Sensitivity: 0.7557932263814616
Specificity: 0.7623413258110014
Threshold: 0.21
Accuracy:  0.7604850934815564

AUROC/AUPRC on Test Data
Loss: 0.46880815327167513
AUROC: 0.8405349743087039
AUPRC: 0.6739309849247453
Sensitivity: 0.7585743106926698
Specificity: 0.7646076794657763
Threshold: 0.22
Accuracy:  0.7630281690140845

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0033.model
AUROC/AUPRC on Validation Data
Loss: 0.5524563882499933
AUROC: 0.8393006644894142
AUPRC: 0.7269264426334188
Sensitivity: 0.7629233511586453
Specificity: 0.7510578279266573
Threshold: 0.14
Accuracy:  0.7544214249621021

AUROC/AUPRC on Test Data
Loss: 0.5075899196995629
AUROC: 0.8355675252779033
AUPRC: 0.6687685315976398
Sensitivity: 0.7626092804303968
Specificity: 0.7507751013594085
Threshold: 0.15
Accuracy:  0.7538732394366198

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0034.model
AUROC/AUPRC on Validation Data
Loss: 0.4835874978452921
AUROC: 0.8451988565653215
AUPRC: 0.7309597583937328
Sensitivity: 0.7522281639928698
Specificity: 0.768688293370945
Threshold: 0.22
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
Loss: 0.45644150409433576
AUROC: 0.8414905490641447
AUPRC: 0.6735509749834355
Sensitivity: 0.7700067249495629
Specificity: 0.7433818268542809
Threshold: 0.22
Accuracy:  0.7503521126760564

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0035.model
AUROC/AUPRC on Validation Data
Loss: 0.535468241199851
AUROC: 0.844850647016083
AUPRC: 0.7306892367149946
Sensitivity: 0.7700534759358288
Specificity: 0.7545839210155149
Threshold: 0.14
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
Loss: 0.49375382628705766
AUROC: 0.8410984875519467
AUPRC: 0.6758013888586478
Sensitivity: 0.7740416946872899
Specificity: 0.7519675649892679
Threshold: 0.15
Accuracy:  0.7577464788732394

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0036.model
AUROC/AUPRC on Validation Data
Loss: 0.5358265694230795
AUROC: 0.8377582344644487
AUPRC: 0.7228284647622987
Sensitivity: 0.7611408199643493
Specificity: 0.7581100141043724
Threshold: 0.15
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
Loss: 0.49122047424316406
AUROC: 0.834810026189292
AUPRC: 0.6699414567323183
Sensitivity: 0.7626092804303968
Specificity: 0.7510135940853804
Threshold: 0.16
Accuracy:  0.7540492957746479

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0037.model
AUROC/AUPRC on Validation Data
Loss: 0.5285645686089993
AUROC: 0.8369449074667692
AUPRC: 0.7212059048450561
Sensitivity: 0.7557932263814616
Specificity: 0.7637517630465445
Threshold: 0.15
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
Loss: 0.490525621175766
AUROC: 0.8340313562601774
AUPRC: 0.6688699334008529
Sensitivity: 0.7579018157363819
Specificity: 0.7586453613164799
Threshold: 0.16
Accuracy:  0.7584507042253521

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0038.model
AUROC/AUPRC on Validation Data
Loss: 0.5512189045548439
AUROC: 0.8370429592531974
AUPRC: 0.7218953791402735
Sensitivity: 0.7468805704099821
Specificity: 0.771509167842031
Threshold: 0.13
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
Loss: 0.5070824017127354
AUROC: 0.8340456305390015
AUPRC: 0.6682463358966753
Sensitivity: 0.7511768661735037
Specificity: 0.7636537085618889
Threshold: 0.14
Accuracy:  0.7603873239436619

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0039.model
AUROC/AUPRC on Validation Data
Loss: 0.4529935624450445
AUROC: 0.8454477572539465
AUPRC: 0.7302100422999224
Sensitivity: 0.7575757575757576
Specificity: 0.7799717912552891
Threshold: 0.26
Accuracy:  0.7736230419403739

AUROC/AUPRC on Test Data
Loss: 0.43347161478466456
AUROC: 0.8418980877438316
AUPRC: 0.678151182102034
Sensitivity: 0.777404169468729
Specificity: 0.7557834486048175
Threshold: 0.26
Accuracy:  0.761443661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0040.model
AUROC/AUPRC on Validation Data
Loss: 0.4710105434060097
AUROC: 0.8456576886428375
AUPRC: 0.7317040762760372
Sensitivity: 0.7468805704099821
Specificity: 0.7722143864598026
Threshold: 0.23
Accuracy:  0.7650328448711471

AUROC/AUPRC on Test Data
Loss: 0.4434152927663591
AUROC: 0.8427423552014751
AUPRC: 0.6772286657400756
Sensitivity: 0.7700067249495629
Specificity: 0.7510135940853804
Threshold: 0.23
Accuracy:  0.7559859154929578

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0041.model
AUROC/AUPRC on Validation Data
Loss: 0.5267639942467213
AUROC: 0.838889601230927
AUPRC: 0.7240731454071737
Sensitivity: 0.7486631016042781
Specificity: 0.768688293370945
Threshold: 0.16
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
Loss: 0.4891811556286282
AUROC: 0.836463276370407
AUPRC: 0.6684470793164428
Sensitivity: 0.7545393409549428
Specificity: 0.7653231576436919
Threshold: 0.17
Accuracy:  0.7625

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0042.model
AUROC/AUPRC on Validation Data
Loss: 0.4789970703423023
AUROC: 0.8434779220060893
AUPRC: 0.7313794830924032
Sensitivity: 0.7575757575757576
Specificity: 0.7679830747531735
Threshold: 0.23
Accuracy:  0.7650328448711471

AUROC/AUPRC on Test Data
Loss: 0.4445084234078725
AUROC: 0.8423426753943991
AUPRC: 0.6784187831472256
Sensitivity: 0.7484868863483524
Specificity: 0.7731934176007632
Threshold: 0.24
Accuracy:  0.766725352112676

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0043.model
AUROC/AUPRC on Validation Data
Loss: 0.5296996142715216
AUROC: 0.8429361230323645
AUPRC: 0.7275973708401513
Sensitivity: 0.7593582887700535
Specificity: 0.7588152327221439
Threshold: 0.13
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
Loss: 0.4859501474433475
AUROC: 0.8396844518300025
AUPRC: 0.6747270316247584
Sensitivity: 0.7639542703429725
Specificity: 0.7560219413307894
Threshold: 0.14
Accuracy:  0.7580985915492958

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0044.model
AUROC/AUPRC on Validation Data
Loss: 0.49235762283205986
AUROC: 0.8402497554990711
AUPRC: 0.7248902846719811
Sensitivity: 0.7522281639928698
Specificity: 0.7693935119887165
Threshold: 0.18
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
Loss: 0.458252861433559
AUROC: 0.8374817221067359
AUPRC: 0.6741248056094361
Sensitivity: 0.7579018157363819
Specificity: 0.7605533031242547
Threshold: 0.19
Accuracy:  0.7598591549295775

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0045.model
AUROC/AUPRC on Validation Data
Loss: 0.4348919577896595
AUROC: 0.8458097946192196
AUPRC: 0.7305495196424624
Sensitivity: 0.7718360071301248
Specificity: 0.770098730606488
Threshold: 0.28
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
Loss: 0.4255447056558397
AUROC: 0.845593842878041
AUPRC: 0.6826523206219844
Sensitivity: 0.7726967047747142
Specificity: 0.7595993322203672
Threshold: 0.29
Accuracy:  0.7630281690140845

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0046.model
AUROC/AUPRC on Validation Data
Loss: 0.4919668585062027
AUROC: 0.842289986901287
AUPRC: 0.7286256838612292
Sensitivity: 0.7522281639928698
Specificity: 0.7679830747531735
Threshold: 0.19
Accuracy:  0.7635169277412834

AUROC/AUPRC on Test Data
Loss: 0.4598423480987549
AUROC: 0.8390552608656532
AUPRC: 0.674938504037097
Sensitivity: 0.7605917955615333
Specificity: 0.761745766754114
Threshold: 0.2
Accuracy:  0.761443661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0047.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.44533923268318176
AUROC: 0.8447287108201402
AUPRC: 0.7303685982314378
Sensitivity: 0.7718360071301248
Specificity: 0.7510578279266573
Threshold: 0.25
Accuracy:  0.7569479535118747

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.42539915409353046
AUROC: 0.841899370825074
AUPRC: 0.6774422255136995
Sensitivity: 0.7666442501681238
Specificity: 0.7622227522060577
Threshold: 0.26
Accuracy:  0.7633802816901408

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0048.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.5312725212424994
AUROC: 0.8392126692964659
AUPRC: 0.7260566425168561
Sensitivity: 0.7647058823529411
Specificity: 0.764456981664316
Threshold: 0.13
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.61it/s]
Loss: 0.4851898034413656
AUROC: 0.8362518887356856
AUPRC: 0.6728843491711329
Sensitivity: 0.7666442501681238
Specificity: 0.755067970426902
Threshold: 0.14
Accuracy:  0.7580985915492958

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0049.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.48527269065380096
AUROC: 0.8411592486719013
AUPRC: 0.7289920247651304
Sensitivity: 0.7593582887700535
Specificity: 0.764456981664316
Threshold: 0.19
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.58it/s]
Loss: 0.44935712781217363
AUROC: 0.8383150031812395
AUPRC: 0.6765209126893463
Sensitivity: 0.7599193006052455
Specificity: 0.7553064631528739
Threshold: 0.2
Accuracy:  0.7565140845070423

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0050.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.42936629243195057
AUROC: 0.8446947698171461
AUPRC: 0.7299169001882522
Sensitivity: 0.7700534759358288
Specificity: 0.7722143864598026
Threshold: 0.26
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.61it/s]
Loss: 0.4168587015734779
AUROC: 0.8410786799852638
AUPRC: 0.6794928328438335
Sensitivity: 0.7673167451244116
Specificity: 0.7655616503696637
Threshold: 0.27
Accuracy:  0.7660211267605633

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0051.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.48233585990965366
AUROC: 0.8406784177961478
AUPRC: 0.7288538174582007
Sensitivity: 0.768270944741533
Specificity: 0.7468265162200282
Threshold: 0.19
Accuracy:  0.7529055078322385

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.44966051512294347
AUROC: 0.838420296035712
AUPRC: 0.6753206662517297
Sensitivity: 0.7579018157363819
Specificity: 0.7576913904125924
Threshold: 0.21
Accuracy:  0.7577464788732394

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0052.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.62it/s]
Loss: 0.45077100582420826
AUROC: 0.84265956671167
AUPRC: 0.7295743920679834
Sensitivity: 0.7611408199643493
Specificity: 0.7743300423131171
Threshold: 0.25
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.43477542334132724
AUROC: 0.8426573510691515
AUPRC: 0.6781268699641572
Sensitivity: 0.7787491593813046
Specificity: 0.7457667541139995
Threshold: 0.25
Accuracy:  0.7544014084507042

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0053.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.4455848168581724
AUROC: 0.8454754128860161
AUPRC: 0.7299585541236803
Sensitivity: 0.7736185383244206
Specificity: 0.7545839210155149
Threshold: 0.22
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.4212643477651808
AUROC: 0.8429303266035187
AUPRC: 0.6793008742355449
Sensitivity: 0.7626092804303968
Specificity: 0.7598378249463391
Threshold: 0.23
Accuracy:  0.7605633802816901

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0054.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.4416801054030657
AUROC: 0.843230278391649
AUPRC: 0.7300285053732501
Sensitivity: 0.7611408199643493
Specificity: 0.7637517630465445
Threshold: 0.23
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.422067666053772
AUROC: 0.8400411484154506
AUPRC: 0.6784005358654072
Sensitivity: 0.7565568258238063
Specificity: 0.7629382303839732
Threshold: 0.24
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0055.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.5068673342466354
AUROC: 0.8461831456521574
AUPRC: 0.7328700574469097
Sensitivity: 0.7557932263814616
Specificity: 0.7722143864598026
Threshold: 0.16
Accuracy:  0.7675593734209196

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.61it/s]
Loss: 0.4717240181234148
AUROC: 0.8420693790897212
AUPRC: 0.6778530372699362
Sensitivity: 0.7605917955615333
Specificity: 0.7648461721917481
Threshold: 0.17
Accuracy:  0.7637323943661972

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0056.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.5066171158105135
AUROC: 0.8410605683483805
AUPRC: 0.7279891200790827
Sensitivity: 0.7629233511586453
Specificity: 0.7602256699576869
Threshold: 0.15
Accuracy:  0.7609903991915109

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.61it/s]
Loss: 0.46638516320122614
AUROC: 0.8381532547520918
AUPRC: 0.6754518532963473
Sensitivity: 0.7545393409549428
Specificity: 0.7658001430956356
Threshold: 0.17
Accuracy:  0.7628521126760563

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0057.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.62it/s]
Loss: 0.4278912916779518
AUROC: 0.8450618354791589
AUPRC: 0.7299316787222265
Sensitivity: 0.7754010695187166
Specificity: 0.765867418899859
Threshold: 0.28
Accuracy:  0.7685699848408287

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.4199565874205695
AUROC: 0.8417101163417878
AUPRC: 0.6802830808463584
Sensitivity: 0.7726967047747142
Specificity: 0.7588838540424517
Threshold: 0.29
Accuracy:  0.7625

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0058.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.43006088770926
AUROC: 0.8418210982302909
AUPRC: 0.7277915347275922
Sensitivity: 0.7611408199643493
Specificity: 0.771509167842031
Threshold: 0.25
Accuracy:  0.7685699848408287

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.414148877064387
AUROC: 0.8398212603674969
AUPRC: 0.6794122618403584
Sensitivity: 0.7626092804303968
Specificity: 0.7653231576436919
Threshold: 0.26
Accuracy:  0.764612676056338

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0059.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.62it/s]
Loss: 0.45435483753681183
AUROC: 0.841660192734614
AUPRC: 0.7280342686584974
Sensitivity: 0.7629233511586453
Specificity: 0.7496473906911142
Threshold: 0.2
Accuracy:  0.7534108135421931

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.59it/s]
Loss: 0.42819860544469623
AUROC: 0.8396102736956637
AUPRC: 0.6796269051952826
Sensitivity: 0.7592468056489576
Specificity: 0.7581683758645361
Threshold: 0.22
Accuracy:  0.7584507042253521

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0060.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.4432645197957754
AUROC: 0.8456803159781672
AUPRC: 0.7288631869644954
Sensitivity: 0.7540106951871658
Specificity: 0.7813822284908322
Threshold: 0.24
Accuracy:  0.7736230419403739

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.4222892983092202
AUROC: 0.8418152488111049
AUPRC: 0.6798121123985505
Sensitivity: 0.7726967047747142
Specificity: 0.746720725017887
Threshold: 0.24
Accuracy:  0.7535211267605634

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0061.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.5037452261894941
AUROC: 0.8387990918896088
AUPRC: 0.7261072860192846
Sensitivity: 0.7700534759358288
Specificity: 0.763046544428773
Threshold: 0.12
Accuracy:  0.7650328448711471

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.61it/s]
Loss: 0.4707975533273485
AUROC: 0.8336043307841183
AUPRC: 0.6741873335459919
Sensitivity: 0.7518493611297915
Specificity: 0.7643691867398045
Threshold: 0.14
Accuracy:  0.7610915492957746

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0062.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.4754335395991802
AUROC: 0.8395294519910799
AUPRC: 0.7271185328161258
Sensitivity: 0.7593582887700535
Specificity: 0.7574047954866009
Threshold: 0.19
Accuracy:  0.7579585649317837

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.44881366226408215
AUROC: 0.8371728202975754
AUPRC: 0.675633945262236
Sensitivity: 0.7646267652992602
Specificity: 0.7531600286191271
Threshold: 0.2
Accuracy:  0.7561619718309859

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0063.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.4576670750975609
AUROC: 0.8395432798071145
AUPRC: 0.7284977413142971
Sensitivity: 0.7629233511586453
Specificity: 0.7574047954866009
Threshold: 0.2
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.61it/s]
Loss: 0.43181887533929614
AUROC: 0.836210990521077
AUPRC: 0.6734486736432155
Sensitivity: 0.7612642905178211
Specificity: 0.7593608394943954
Threshold: 0.22
Accuracy:  0.7598591549295775

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0064.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.5259587559849024
AUROC: 0.8397217843413811
AUPRC: 0.7252091873703902
Sensitivity: 0.7540106951871658
Specificity: 0.7771509167842031
Threshold: 0.11
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.59it/s]
Loss: 0.4850300365024143
AUROC: 0.836550044739439
AUPRC: 0.6755699273951086
Sensitivity: 0.7612642905178211
Specificity: 0.766754113999523
Threshold: 0.12
Accuracy:  0.7653169014084507

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0065.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.46631578728556633
AUROC: 0.8398701190952083
AUPRC: 0.7280804105019303
Sensitivity: 0.7557932263814616
Specificity: 0.7602256699576869
Threshold: 0.22
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.44018086393674216
AUROC: 0.8375665658539041
AUPRC: 0.6759555064546798
Sensitivity: 0.7626092804303968
Specificity: 0.7560219413307894
Threshold: 0.23
Accuracy:  0.7577464788732394

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0066.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.5049596428871155
AUROC: 0.8401869017898222
AUPRC: 0.7266084931090515
Sensitivity: 0.7664884135472371
Specificity: 0.7693935119887165
Threshold: 0.12
Accuracy:  0.7685699848408287

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.4681925117969513
AUROC: 0.8365065803623453
AUPRC: 0.6744748561305868
Sensitivity: 0.769334229993275
Specificity: 0.7593608394943954
Threshold: 0.13
Accuracy:  0.7619718309859155

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0067.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.466942198574543
AUROC: 0.8383553447023122
AUPRC: 0.7243870027993513
Sensitivity: 0.7611408199643493
Specificity: 0.7609308885754584
Threshold: 0.2
Accuracy:  0.7609903991915109

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.4380565702915192
AUROC: 0.8340939064707551
AUPRC: 0.6747481202404872
Sensitivity: 0.7491593813046402
Specificity: 0.7646076794657763
Threshold: 0.22
Accuracy:  0.7605633802816901

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0068.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.46152629144489765
AUROC: 0.8404534015170372
AUPRC: 0.7262271883049249
Sensitivity: 0.7450980392156863
Specificity: 0.7813822284908322
Threshold: 0.23
Accuracy:  0.7710965133906014

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.61it/s]
Loss: 0.43218313455581664
AUROC: 0.8399752301166112
AUPRC: 0.6772033416432561
Sensitivity: 0.7713517148621385
Specificity: 0.7498211304555211
Threshold: 0.23
Accuracy:  0.7554577464788732

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0069.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.47605068050324917
AUROC: 0.8444772959831451
AUPRC: 0.7307047242963947
Sensitivity: 0.7611408199643493
Specificity: 0.7538787023977433
Threshold: 0.19
Accuracy:  0.7559373420919656

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.58it/s]
Loss: 0.4448750916454527
AUROC: 0.8432926366694034
AUPRC: 0.6780780898367041
Sensitivity: 0.7498318762609281
Specificity: 0.7641306940138326
Threshold: 0.21
Accuracy:  0.7603873239436619

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0070.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.4942719303071499
AUROC: 0.8415194004258967
AUPRC: 0.7283333353955891
Sensitivity: 0.7540106951871658
Specificity: 0.771509167842031
Threshold: 0.16
Accuracy:  0.7665487620010106

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.45621133148670195
AUROC: 0.8397484455069781
AUPRC: 0.6785167835963433
Sensitivity: 0.7659717552118359
Specificity: 0.7595993322203672
Threshold: 0.17
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0071.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.46531506441533566
AUROC: 0.8419166358683492
AUPRC: 0.7292218579237656
Sensitivity: 0.7575757575757576
Specificity: 0.7722143864598026
Threshold: 0.18
Accuracy:  0.7680646791308742

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.4338443977965249
AUROC: 0.8383040167980996
AUPRC: 0.678074795086282
Sensitivity: 0.7673167451244116
Specificity: 0.7555449558788457
Threshold: 0.19
Accuracy:  0.7586267605633803

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0072.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.47504277527332306
AUROC: 0.8440825746890627
AUPRC: 0.732000729453055
Sensitivity: 0.7736185383244206
Specificity: 0.7609308885754584
Threshold: 0.16
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.4460296173890432
AUROC: 0.8409309652572072
AUPRC: 0.6791825740728797
Sensitivity: 0.7605917955615333
Specificity: 0.7624612449320296
Threshold: 0.18
Accuracy:  0.7619718309859155

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0073.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.5089998822659254
AUROC: 0.8387211532901404
AUPRC: 0.7259310219291608
Sensitivity: 0.7736185383244206
Specificity: 0.7574047954866009
Threshold: 0.14
Accuracy:  0.7620010106114199

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.4728281706571579
AUROC: 0.8336212514180052
AUPRC: 0.6722788632341312
Sensitivity: 0.7531943510423672
Specificity: 0.763415215835917
Threshold: 0.16
Accuracy:  0.7607394366197183

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0074.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.43936046585440636
AUROC: 0.847205146964543
AUPRC: 0.7312603347439823
Sensitivity: 0.7700534759358288
Specificity: 0.7595204513399154
Threshold: 0.28
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.4349896510442098
AUROC: 0.8443713872241356
AUPRC: 0.6782477264383983
Sensitivity: 0.7686617350369872
Specificity: 0.7548294777009301
Threshold: 0.29
Accuracy:  0.7584507042253521

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0075.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.64it/s]
Loss: 0.5362247955054045
AUROC: 0.838528820939839
AUPRC: 0.7228202315520003
Sensitivity: 0.7611408199643493
Specificity: 0.770098730606488
Threshold: 0.12
Accuracy:  0.7675593734209196

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.60it/s]
Loss: 0.4913577569855584
AUROC: 0.8352372922430842
AUPRC: 0.6746070031561469
Sensitivity: 0.7619367854741089
Specificity: 0.7641306940138326
Threshold: 0.13
Accuracy:  0.763556338028169

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0076.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.46201276406645775
AUROC: 0.8469725882403225
AUPRC: 0.7330496198058515
Sensitivity: 0.7736185383244206
Specificity: 0.7581100141043724
Threshold: 0.2
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.4341776586241192
AUROC: 0.8419660910496904
AUPRC: 0.6798346436175855
Sensitivity: 0.7700067249495629
Specificity: 0.7505366086334366
Threshold: 0.21
Accuracy:  0.7556338028169014

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0077.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.48223480582237244
AUROC: 0.8398952605789077
AUPRC: 0.7283688537329684
Sensitivity: 0.7593582887700535
Specificity: 0.7665726375176305
Threshold: 0.18
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.61it/s]
Loss: 0.4503435148133172
AUROC: 0.8369081847912851
AUPRC: 0.6773899141739669
Sensitivity: 0.7626092804303968
Specificity: 0.7560219413307894
Threshold: 0.19
Accuracy:  0.7577464788732394

Intermediate Model:
  ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0078.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.63it/s]
Loss: 0.426361421123147
AUROC: 0.8423804962426052
AUPRC: 0.7276975216247239
Sensitivity: 0.7647058823529411
Specificity: 0.770098730606488
Threshold: 0.27
Accuracy:  0.7685699848408287

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.4167256166537603
AUROC: 0.8394632807008061
AUPRC: 0.6785430229601767
Sensitivity: 0.7592468056489576
Specificity: 0.7641306940138326
Threshold: 0.28
Accuracy:  0.7628521126760563


Plot AUROC/AUPRC for Each Intermediate Model
  Epoch with best Validation Loss:      58, 0.4259
  Epoch with best model Test AUROC:     45, 0.8456
  Epoch with best model Test Accuracy:   1, 0.7789

AUROC/AUPRC Plots - Best Model Based on Validation Loss
  Epoch with best Validation Loss:   58, 0.4259
  Best Model Based on Validation Loss:
    ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0058.model

Generate Stats Based on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.58it/s]
Loss: 0.414148877064387
AUROC: 0.8398212603674969
AUPRC: 0.6794122618403584
Sensitivity: 0.7626092804303968
Specificity: 0.7653231576436919
Threshold: 0.26
Accuracy:  0.764612676056338
best_model_val_test_auroc: 0.8398212603674969
best_model_val_test_auprc: 0.6794122618403584

AUROC/AUPRC Plots - Best Model Based on Model AUROC
  Epoch with best model Test AUROC:  45, 0.8456
  Best Model Based on Model AUROC:
    ./vitaldb_cache/models/ABP_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_91159da4_0045.model

Generate Stats Based on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:17<00:00,  2.62it/s]
Loss: 0.4255447056558397
AUROC: 0.845593842878041
AUPRC: 0.6826523206219844
Sensitivity: 0.7726967047747142
Specificity: 0.7595993322203672
Threshold: 0.29
Accuracy:  0.7630281690140845
best_model_auroc_test_auroc: 0.845593842878041
best_model_auroc_test_auprc: 0.6826523206219844

Total Processing Time: 5925.3200 sec
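In the per-epoch reports above, the reported sensitivity and specificity are always nearly equal, which suggests the decision threshold is chosen at the point on the ROC curve where sensitivity (TPR) and specificity (1 - FPR) cross. The sketch below illustrates that style of operating-point selection on synthetic data; the `y_true`/`y_score` arrays and the scikit-learn-based selection are illustrative assumptions, not the notebook's actual implementation.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score, average_precision_score

rng = np.random.default_rng(0)
# Synthetic labels/scores standing in for IOH labels and model probabilities;
# positives score higher on average so the ROC curve is informative.
y_true = rng.integers(0, 2, size=2000)
y_score = np.clip(0.35 * y_true + rng.normal(0.3, 0.2, size=2000), 0.0, 1.0)

fpr, tpr, thresholds = roc_curve(y_true, y_score)
# Operating point where sensitivity (TPR) and specificity (1 - FPR) are closest.
i = int(np.argmin(np.abs(tpr - (1.0 - fpr))))
threshold = float(thresholds[i])
sensitivity = float(tpr[i])
specificity = float(1.0 - fpr[i])

auroc = roc_auc_score(y_true, y_score)
auprc = average_precision_score(y_true, y_score)
print(f"AUROC: {auroc:.4f}  AUPRC: {auprc:.4f}")
print(f"Sensitivity: {sensitivity:.4f}  Specificity: {specificity:.4f}  Threshold: {threshold:.2f}")
```

With this balanced operating point, sensitivity and specificity land within a few percentage points of each other, matching the pattern in the log above.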
In [106]:
# ECG-only variant (disabled by default; set RUN_ME = True to train).
RUN_ME = False
DISPLAY_MODEL_PREDICTION = True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY = True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=False, 
        useEeg=False, 
        useEcg=True,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=1e-1,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
In [107]:
# EEG-only variant (disabled by default; set RUN_ME = True to train).
RUN_ME = False
DISPLAY_MODEL_PREDICTION = True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY = True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=False, 
        useEeg=True, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=1e-1,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
In [108]:
# ABP + ECG experiment (enabled)
RUN_ME = True
DISPLAY_MODEL_PREDICTION = True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY = True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=True,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=1e-1,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
Experiment Setup
  name:              ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb
  prediction_window: 003
  max_cases:         _ALL
  use_abp:           True
  use_eeg:           False
  use_ecg:           True
  n_residuals:       12
  skip_connection:   False
  batch_size:        128
  learning_rate:     0.0001
  weight_decay:      0.1
  balance_labels:    False
  max_epochs:        200
  patience:          20
  device:            mps

Model Architecture
HypotensionCNN(
  (abpResiduals): Sequential(
    (0): ResidualBlock(
      (bn1): BatchNorm1d(1, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (1): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (2): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (3): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (4): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (5): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (6): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (7): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (8): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=1, dilation=1, ceil_mode=False)
    )
    (9): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (10): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (11): ResidualBlock(
      (bn1): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
  )
  (abpFc): Linear(in_features=2814, out_features=32, bias=True)
  (ecgResiduals): Sequential(
    (0): ResidualBlock(
      (bn1): BatchNorm1d(1, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (1): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (2): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (3): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (4): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (5): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (6): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (7): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (8): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=1, dilation=1, ceil_mode=False)
    )
    (9): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (10): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (11): ResidualBlock(
      (bn1): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
  )
  (ecgFc): Linear(in_features=2814, out_features=32, bias=True)
  (fullLinear1): Linear(in_features=64, out_features=16, bias=True)
  (fullLinear2): Linear(in_features=16, out_features=1, bias=True)
  (sigmoid): Sigmoid()
)
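As a sanity check on the printed architecture, the `abpFc` input size of 2814 can be reproduced from the MaxPool1d output-length formula: six downsampling blocks (indices 0, 2, 4, 6, 8, and 10, with block 8 using padding=1) reduce the input segment to 469 samples across 6 channels. The 30,000-sample segment length used below is an assumption inferred from this arithmetic, not stated in the output itself.

```python
def maxpool1d_out(length, kernel=2, stride=2, padding=0):
    # PyTorch MaxPool1d output-length formula: floor((L + 2p - k) / s) + 1
    return (length + 2 * padding - kernel) // stride + 1

# Downsampling blocks in the printed architecture use padding=0 except block 8
length = 30000  # assumed samples per input segment (inferred, not printed)
for pad in (0, 0, 0, 0, 1, 0):
    length = maxpool1d_out(length, padding=pad)

channels = 6  # final channel count after residual block 11
print(length, channels * length)  # → 469 2814, matching abpFc in_features
```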

Training Loop
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.67it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.55it/s]
[2024-05-05 05:32:57.156482] Completed epoch 0 with training loss 0.50931692, validation loss 0.66338497
Validation loss improved to 0.66338497. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.68it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 05:33:58.460675] Completed epoch 1 with training loss 0.44562477, validation loss 0.63745475
Validation loss improved to 0.63745475. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.68it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 05:34:59.788626] Completed epoch 2 with training loss 0.44022125, validation loss 0.62522602
Validation loss improved to 0.62522602. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.68it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 05:36:00.979688] Completed epoch 3 with training loss 0.43809125, validation loss 0.61594057
Validation loss improved to 0.61594057. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.68it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.54it/s]
[2024-05-05 05:37:02.090540] Completed epoch 4 with training loss 0.43918747, validation loss 0.57035375
Validation loss improved to 0.57035375. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.68it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 05:38:03.302451] Completed epoch 5 with training loss 0.44046849, validation loss 0.60301912
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.68it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 05:39:04.418554] Completed epoch 6 with training loss 0.43801224, validation loss 0.56776011
Validation loss improved to 0.56776011. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.68it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 05:40:05.759507] Completed epoch 7 with training loss 0.43589973, validation loss 0.58155072
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:54<00:00,  1.68it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.54it/s]
[2024-05-05 05:41:06.798103] Completed epoch 8 with training loss 0.43600658, validation loss 0.59658462
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:42:08.973360] Completed epoch 9 with training loss 0.43488139, validation loss 0.61641479
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:43:11.173076] Completed epoch 10 with training loss 0.43644515, validation loss 0.56186622
Validation loss improved to 0.56186622. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:44:13.422933] Completed epoch 11 with training loss 0.43534786, validation loss 0.61560130
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:45:15.706105] Completed epoch 12 with training loss 0.43481529, validation loss 0.55370551
Validation loss improved to 0.55370551. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.66it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
[2024-05-05 05:46:17.812688] Completed epoch 13 with training loss 0.43672678, validation loss 0.54498243
Validation loss improved to 0.54498243. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.66it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.49it/s]
[2024-05-05 05:47:19.929417] Completed epoch 14 with training loss 0.43418828, validation loss 0.56581098
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
[2024-05-05 05:48:22.154523] Completed epoch 15 with training loss 0.43404964, validation loss 0.52756894
Validation loss improved to 0.52756894. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.66it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:49:24.128030] Completed epoch 16 with training loss 0.43441722, validation loss 0.57191592
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:50:26.301323] Completed epoch 17 with training loss 0.43543032, validation loss 0.56071311
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:51:28.551762] Completed epoch 18 with training loss 0.43559286, validation loss 0.58048642
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.66it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:52:30.652412] Completed epoch 19 with training loss 0.43408766, validation loss 0.51608884
Validation loss improved to 0.51608884. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:53:32.859338] Completed epoch 20 with training loss 0.43697909, validation loss 0.54622805
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.66it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:54:34.925464] Completed epoch 21 with training loss 0.43206429, validation loss 0.53771794
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
[2024-05-05 05:55:37.024499] Completed epoch 22 with training loss 0.43020865, validation loss 0.52753699
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.65it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
[2024-05-05 05:56:39.274723] Completed epoch 23 with training loss 0.43256110, validation loss 0.53897625
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.66it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
[2024-05-05 05:57:41.223895] Completed epoch 24 with training loss 0.42943951, validation loss 0.53592104
No improvement in validation loss. 5 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.61it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 05:58:44.943065] Completed epoch 25 with training loss 0.42935792, validation loss 0.51994187
No improvement in validation loss. 6 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.64it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
[2024-05-05 05:59:47.666750] Completed epoch 26 with training loss 0.43107513, validation loss 0.50279522
Validation loss improved to 0.50279522. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:55<00:00,  1.64it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
[2024-05-05 06:00:50.334765] Completed epoch 27 with training loss 0.43063745, validation loss 0.56057930
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:56<00:00,  1.64it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.41it/s]
[2024-05-05 06:01:53.167646] Completed epoch 28 with training loss 0.42962074, validation loss 0.49027488
Validation loss improved to 0.49027488. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:56<00:00,  1.64it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.43it/s]
[2024-05-05 06:02:55.998211] Completed epoch 29 with training loss 0.43107131, validation loss 0.49681234
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:56<00:00,  1.63it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.43it/s]
[2024-05-05 06:03:58.947470] Completed epoch 30 with training loss 0.42899224, validation loss 0.51841319
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:56<00:00,  1.64it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.43it/s]
[2024-05-05 06:05:01.772733] Completed epoch 31 with training loss 0.42975691, validation loss 0.48005632
Validation loss improved to 0.48005632. Model saved.
[2024-05-05 06:06:04.923110] Completed epoch 32 with training loss 0.42759585, validation loss 0.60481209
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 06:07:07.856875] Completed epoch 33 with training loss 0.43045214, validation loss 0.55559206
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 06:08:10.669534] Completed epoch 34 with training loss 0.42833298, validation loss 0.47589540
Validation loss improved to 0.47589540. Model saved.
[2024-05-05 06:09:13.779391] Completed epoch 35 with training loss 0.42795521, validation loss 0.48876894
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 06:10:16.672897] Completed epoch 36 with training loss 0.42826739, validation loss 0.50017405
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 06:11:19.638544] Completed epoch 37 with training loss 0.42946920, validation loss 0.48882169
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 06:12:22.535518] Completed epoch 38 with training loss 0.42858467, validation loss 0.48085442
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 06:13:25.679317] Completed epoch 39 with training loss 0.42875999, validation loss 0.55680132
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 06:14:28.864343] Completed epoch 40 with training loss 0.43116814, validation loss 0.49253851
No improvement in validation loss. 6 epochs without improvement.
[2024-05-05 06:15:31.905362] Completed epoch 41 with training loss 0.42938474, validation loss 0.49758023
No improvement in validation loss. 7 epochs without improvement.
[2024-05-05 06:16:35.065256] Completed epoch 42 with training loss 0.42796165, validation loss 0.50469935
No improvement in validation loss. 8 epochs without improvement.
[2024-05-05 06:17:37.995972] Completed epoch 43 with training loss 0.42874578, validation loss 0.47806063
No improvement in validation loss. 9 epochs without improvement.
[2024-05-05 06:18:41.264725] Completed epoch 44 with training loss 0.42703858, validation loss 0.45439634
Validation loss improved to 0.45439634. Model saved.
[2024-05-05 06:19:44.545323] Completed epoch 45 with training loss 0.43042693, validation loss 0.47321996
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 06:20:47.521967] Completed epoch 46 with training loss 0.42433208, validation loss 0.48712865
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 06:21:50.463806] Completed epoch 47 with training loss 0.43155554, validation loss 0.53139716
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 06:22:53.306297] Completed epoch 48 with training loss 0.42789552, validation loss 0.49914482
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 06:23:56.478953] Completed epoch 49 with training loss 0.42719653, validation loss 0.42161432
Validation loss improved to 0.42161432. Model saved.
[2024-05-05 06:24:59.487005] Completed epoch 50 with training loss 0.42708781, validation loss 0.47330469
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 06:26:02.548083] Completed epoch 51 with training loss 0.42731380, validation loss 0.50128824
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 06:27:05.695891] Completed epoch 52 with training loss 0.42553744, validation loss 0.48117837
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 06:28:08.897441] Completed epoch 53 with training loss 0.42687735, validation loss 0.43190765
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 06:29:11.677387] Completed epoch 54 with training loss 0.42580065, validation loss 0.44173911
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 06:30:14.751203] Completed epoch 55 with training loss 0.42646205, validation loss 0.45044124
No improvement in validation loss. 6 epochs without improvement.
[2024-05-05 06:31:17.727350] Completed epoch 56 with training loss 0.42819786, validation loss 0.45594218
No improvement in validation loss. 7 epochs without improvement.
[2024-05-05 06:32:20.793263] Completed epoch 57 with training loss 0.42727610, validation loss 0.46248233
No improvement in validation loss. 8 epochs without improvement.
[2024-05-05 06:33:23.662221] Completed epoch 58 with training loss 0.42623559, validation loss 0.46479762
No improvement in validation loss. 9 epochs without improvement.
[2024-05-05 06:34:26.604122] Completed epoch 59 with training loss 0.42640093, validation loss 0.45385152
No improvement in validation loss. 10 epochs without improvement.
[2024-05-05 06:35:29.526624] Completed epoch 60 with training loss 0.42367733, validation loss 0.43918258
No improvement in validation loss. 11 epochs without improvement.
[2024-05-05 06:36:32.502193] Completed epoch 61 with training loss 0.42631733, validation loss 0.48482808
No improvement in validation loss. 12 epochs without improvement.
[2024-05-05 06:37:35.270711] Completed epoch 62 with training loss 0.42597991, validation loss 0.47261161
No improvement in validation loss. 13 epochs without improvement.
[2024-05-05 06:38:38.402689] Completed epoch 63 with training loss 0.42396623, validation loss 0.44880000
No improvement in validation loss. 14 epochs without improvement.
[2024-05-05 06:39:41.477278] Completed epoch 64 with training loss 0.42225036, validation loss 0.45677775
No improvement in validation loss. 15 epochs without improvement.
[2024-05-05 06:40:44.729319] Completed epoch 65 with training loss 0.42511505, validation loss 0.44096351
No improvement in validation loss. 16 epochs without improvement.
[2024-05-05 06:41:47.737903] Completed epoch 66 with training loss 0.42168647, validation loss 0.43884537
No improvement in validation loss. 17 epochs without improvement.
[2024-05-05 06:42:50.645380] Completed epoch 67 with training loss 0.42758730, validation loss 0.46020946
No improvement in validation loss. 18 epochs without improvement.
[2024-05-05 06:43:53.770711] Completed epoch 68 with training loss 0.42547849, validation loss 0.44858268
No improvement in validation loss. 19 epochs without improvement.
[2024-05-05 06:44:56.613238] Completed epoch 69 with training loss 0.42440519, validation loss 0.45291087
No improvement in validation loss. 20 epochs without improvement.
Early stopping due to no improvement in validation loss.
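The run above checkpoints the model whenever the validation loss improves and halts after 20 consecutive epochs without improvement. A minimal sketch of that patience logic, assuming a simple tracker class (the names `EarlyStopping`, `patience`, and `step` are illustrative, not the project's actual API):

```python
class EarlyStopping:
    """Track validation loss across epochs; stop after `patience` epochs
    with no improvement. Mirrors the behavior visible in the log above."""

    def __init__(self, patience=20):
        self.patience = patience
        self.best_loss = float("inf")
        self.epochs_without_improvement = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True to stop training."""
        if val_loss < self.best_loss:
            self.best_loss = val_loss
            self.epochs_without_improvement = 0
            # In the real training loop, the model checkpoint would be saved here.
        else:
            self.epochs_without_improvement += 1
        return self.epochs_without_improvement >= self.patience
```

With `patience=20`, the epoch-49 loss of 0.4216 is the last improvement, so epochs 50 through 69 count up to the limit and training stops, matching the log.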

Plot Validation and Loss Values from Training
  Epoch with best Validation Loss:   49, 0.4216
Generate AUROC/AUPRC for Each Intermediate Model

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0000.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:05<00:00,  2.73it/s]
Loss: 0.6585641540586948
AUROC: 0.8394037445725823
AUPRC: 0.7147438863663644
Sensitivity: 0.7843137254901961
Specificity: 0.7404795486600846
Threshold: 0.09
Accuracy:  0.7529055078322385

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.6095466564098994
AUROC: 0.833443464473325
AUPRC: 0.6703070707225908
Sensitivity: 0.7377269670477471
Specificity: 0.7925113284044837
Threshold: 0.1
Accuracy:  0.778169014084507
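Each intermediate-model block reports AUROC, which can be read as the probability that a randomly chosen positive window receives a higher score than a randomly chosen negative one. A self-contained, pure-Python sketch of that rank-based definition (illustrative only; the project's evaluation presumably uses a library implementation):

```python
def auroc(y_true, y_score):
    """Rank-based AUROC: fraction of (positive, negative) pairs ranked
    correctly, counting ties as half a win (the Wilcoxon-Mann-Whitney
    statistic)."""
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```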

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0001.model
AUROC/AUPRC on Validation Data
Loss: 0.6394929029047489
AUROC: 0.8425401446640972
AUPRC: 0.7239002788371881
Sensitivity: 0.7771836007130125
Specificity: 0.7489421720733427
Threshold: 0.1
Accuracy:  0.7569479535118747

AUROC/AUPRC on Test Data
Loss: 0.5840489817990198
AUROC: 0.8383946344108596
AUPRC: 0.6785149955028736
Sensitivity: 0.7397444519166106
Specificity: 0.7917958502265681
Threshold: 0.11
Accuracy:  0.778169014084507

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0002.model
AUROC/AUPRC on Validation Data
Loss: 0.6237076744437218
AUROC: 0.8427639038690229
AUPRC: 0.7264595069220786
Sensitivity: 0.7540106951871658
Specificity: 0.7743300423131171
Threshold: 0.11
Accuracy:  0.7685699848408287

AUROC/AUPRC on Test Data
Loss: 0.5732368859979842
AUROC: 0.8384757892994552
AUPRC: 0.680392693879719
Sensitivity: 0.7720242098184263
Specificity: 0.7581683758645361
Threshold: 0.11
Accuracy:  0.7617957746478873

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0003.model
AUROC/AUPRC on Validation Data
Loss: 0.6199825331568718
AUROC: 0.8428343000233817
AUPRC: 0.7273506074013926
Sensitivity: 0.7878787878787878
Specificity: 0.7376586741889986
Threshold: 0.1
Accuracy:  0.7518948964123294

AUROC/AUPRC on Test Data
Loss: 0.5731531116697524
AUROC: 0.8385508495521485
AUPRC: 0.6808286667741941
Sensitivity: 0.7545393409549428
Specificity: 0.7765323157643692
Threshold: 0.11
Accuracy:  0.770774647887324

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0004.model
AUROC/AUPRC on Validation Data
Loss: 0.5774903893470764
AUROC: 0.8432315354658338
AUPRC: 0.7275509376859409
Sensitivity: 0.7629233511586453
Specificity: 0.764456981664316
Threshold: 0.14
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
Loss: 0.5314997480975256
AUROC: 0.8388449157344413
AUPRC: 0.6803536055911966
Sensitivity: 0.7800941492938803
Specificity: 0.7495826377295493
Threshold: 0.14
Accuracy:  0.7575704225352112

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0005.model
AUROC/AUPRC on Validation Data
Loss: 0.6028499491512775
AUROC: 0.8431548539405506
AUPRC: 0.7287477239893152
Sensitivity: 0.7433155080213903
Specificity: 0.7863187588152327
Threshold: 0.12
Accuracy:  0.7741283476503285

AUROC/AUPRC on Test Data
Loss: 0.5554758860005273
AUROC: 0.838275789010762
AUPRC: 0.6801019487824292
Sensitivity: 0.7652992602555481
Specificity: 0.7660386358216075
Threshold: 0.12
Accuracy:  0.7658450704225352

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0006.model
AUROC/AUPRC on Validation Data
Loss: 0.5697213597595692
AUROC: 0.8431007997505964
AUPRC: 0.7275836519214954
Sensitivity: 0.7611408199643493
Specificity: 0.771509167842031
Threshold: 0.14
Accuracy:  0.7685699848408287

AUROC/AUPRC on Test Data
Loss: 0.5273199492030674
AUROC: 0.8384326456926723
AUPRC: 0.6797128768257327
Sensitivity: 0.7753866845998655
Specificity: 0.753398521345099
Threshold: 0.14
Accuracy:  0.7591549295774648

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0007.model
AUROC/AUPRC on Validation Data
Loss: 0.5786867663264275
AUROC: 0.8427639038690229
AUPRC: 0.7274268432077837
Sensitivity: 0.7557932263814616
Specificity: 0.7764456981664316
Threshold: 0.13
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
Loss: 0.5359080261654324
AUROC: 0.8376085867646
AUPRC: 0.6790936825112972
Sensitivity: 0.7747141896435776
Specificity: 0.7553064631528739
Threshold: 0.13
Accuracy:  0.7603873239436619

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0008.model
AUROC/AUPRC on Validation Data
Loss: 0.593440119177103
AUROC: 0.8421466804441998
AUPRC: 0.7271603301350988
Sensitivity: 0.7522281639928698
Specificity: 0.7785613540197461
Threshold: 0.12
Accuracy:  0.7710965133906014

AUROC/AUPRC on Test Data
Loss: 0.5492225872145758
AUROC: 0.8369276715876575
AUPRC: 0.6786100512740687
Sensitivity: 0.7747141896435776
Specificity: 0.7538755067970427
Threshold: 0.12
Accuracy:  0.759330985915493

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0009.model
AUROC/AUPRC on Validation Data
Loss: 0.62555031478405
AUROC: 0.841447747197353
AUPRC: 0.7269476308465321
Sensitivity: 0.7629233511586453
Specificity: 0.767277856135402
Threshold: 0.1
Accuracy:  0.7660434562910561

AUROC/AUPRC on Test Data
Loss: 0.5696400086085002
AUROC: 0.8359267880258368
AUPRC: 0.6781563998147443
Sensitivity: 0.7868190988567586
Specificity: 0.7381349868829
Threshold: 0.1
Accuracy:  0.7508802816901409

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0010.model
AUROC/AUPRC on Validation Data
Loss: 0.5678100753575563
AUROC: 0.8416249946574348
AUPRC: 0.7255560564101918
Sensitivity: 0.7522281639928698
Specificity: 0.7806770098730607
Threshold: 0.14
Accuracy:  0.7726124305204649

AUROC/AUPRC on Test Data
Loss: 0.5238151914543576
AUROC: 0.8361487610808099
AUPRC: 0.6779258279874619
Sensitivity: 0.7726967047747142
Specificity: 0.7519675649892679
Threshold: 0.14
Accuracy:  0.7573943661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0011.model
AUROC/AUPRC on Validation Data
Loss: 0.6062351558357477
AUROC: 0.8404496302944823
AUPRC: 0.7257831189053076
Sensitivity: 0.7415329768270945
Specificity: 0.7905500705218618
Threshold: 0.11
Accuracy:  0.7766548762001011

AUROC/AUPRC on Test Data
Loss: 0.5620404144128164
AUROC: 0.8350350465622163
AUPRC: 0.6770109785446444
Sensitivity: 0.7531943510423672
Specificity: 0.7641306940138326
Threshold: 0.11
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0012.model
AUROC/AUPRC on Validation Data
Loss: 0.5587483812123537
AUROC: 0.841086966906265
AUPRC: 0.7256746940628384
Sensitivity: 0.7736185383244206
Specificity: 0.7425952045133991
Threshold: 0.14
Accuracy:  0.751389590702375

AUROC/AUPRC on Test Data
Loss: 0.5148461669683456
AUROC: 0.8354183670834489
AUPRC: 0.6771180669800939
Sensitivity: 0.7585743106926698
Specificity: 0.7619842594800859
Threshold: 0.15
Accuracy:  0.7610915492957746

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0013.model
AUROC/AUPRC on Validation Data
Loss: 0.5432230290025473
AUROC: 0.8411510776896987
AUPRC: 0.7264179222160444
Sensitivity: 0.7629233511586453
Specificity: 0.7722143864598026
Threshold: 0.15
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
Loss: 0.503673791885376
AUROC: 0.8350004033686655
AUPRC: 0.6767375855329413
Sensitivity: 0.7444519166106254
Specificity: 0.7758168375864536
Threshold: 0.16
Accuracy:  0.7676056338028169

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0014.model
AUROC/AUPRC on Validation Data
Loss: 0.566126300022006
AUROC: 0.8404244888107827
AUPRC: 0.725851990790711
Sensitivity: 0.7611408199643493
Specificity: 0.7757404795486601
Threshold: 0.13
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
Loss: 0.525459447171953
AUROC: 0.8346112287892635
AUPRC: 0.675724871268262
Sensitivity: 0.7437794216543376
Specificity: 0.7777247793942285
Threshold: 0.14
Accuracy:  0.768838028169014

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0015.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.5297484509646893
AUROC: 0.8398412063889539
AUPRC: 0.7248473060878339
Sensitivity: 0.7629233511586453
Specificity: 0.765867418899859
Threshold: 0.17
Accuracy:  0.7650328448711471

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.4953435613049401
AUROC: 0.8337545314820823
AUPRC: 0.6754021948800969
Sensitivity: 0.7484868863483524
Specificity: 0.7677080849034105
Threshold: 0.18
Accuracy:  0.7626760563380282

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0016.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.5712855085730553
AUROC: 0.8396853291900169
AUPRC: 0.7244904320954157
Sensitivity: 0.7754010695187166
Specificity: 0.7440056417489421
Threshold: 0.12
Accuracy:  0.7529055078322385

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.5309532307916217
AUROC: 0.8346040916498516
AUPRC: 0.6747348780881206
Sensitivity: 0.7632817753866846
Specificity: 0.7553064631528739
Threshold: 0.13
Accuracy:  0.7573943661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0017.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.5638172384351492
AUROC: 0.8396702442997971
AUPRC: 0.7230129154200511
Sensitivity: 0.7754010695187166
Specificity: 0.7447108603667136
Threshold: 0.13
Accuracy:  0.7534108135421931

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.5183212359746298
AUROC: 0.8347001623578928
AUPRC: 0.6746120796267745
Sensitivity: 0.7673167451244116
Specificity: 0.7541139995230145
Threshold: 0.14
Accuracy:  0.7575704225352112

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0018.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.5754019264131784
AUROC: 0.8404584298137772
AUPRC: 0.7259862633870752
Sensitivity: 0.750445632798574
Specificity: 0.7813822284908322
Threshold: 0.12
Accuracy:  0.7726124305204649

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.5305439952347014
AUROC: 0.8355073808446556
AUPRC: 0.6749470221309231
Sensitivity: 0.7780766644250168
Specificity: 0.74839017409969
Threshold: 0.12
Accuracy:  0.7561619718309859

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0019.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.5178478695452213
AUROC: 0.8406344201996736
AUPRC: 0.7277410621147598
Sensitivity: 0.7664884135472371
Specificity: 0.7510578279266573
Threshold: 0.17
Accuracy:  0.7554320363820111

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4830167998870214
AUROC: 0.8343596646731326
AUPRC: 0.6746864370783334
Sensitivity: 0.7673167451244116
Specificity: 0.7495826377295493
Threshold: 0.18
Accuracy:  0.754225352112676

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0020.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.5473580174148083
AUROC: 0.8411058230190397
AUPRC: 0.7267234730078798
Sensitivity: 0.7736185383244206
Specificity: 0.7524682651622003
Threshold: 0.14
Accuracy:  0.7584638706417383

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.43it/s]
Loss: 0.5061421010229322
AUROC: 0.8360244625854311
AUPRC: 0.6743735709041797
Sensitivity: 0.7686617350369872
Specificity: 0.7562604340567612
Threshold: 0.15
Accuracy:  0.7595070422535212

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0021.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.539369635283947
AUROC: 0.8423691825749404
AUPRC: 0.7271388496092628
Sensitivity: 0.7647058823529411
Specificity: 0.7566995768688294
Threshold: 0.14
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.5035528924730089
AUROC: 0.8384151637107415
AUPRC: 0.6757510118612311
Sensitivity: 0.769334229993275
Specificity: 0.7569759122346769
Threshold: 0.15
Accuracy:  0.7602112676056338

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0022.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.5293771512806416
AUROC: 0.842968806961174
AUPRC: 0.7308971802628059
Sensitivity: 0.7575757575757576
Specificity: 0.763046544428773
Threshold: 0.16
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.49074583914544845
AUROC: 0.8382746663146747
AUPRC: 0.6761836092734247
Sensitivity: 0.7579018157363819
Specificity: 0.7626997376580015
Threshold: 0.17
Accuracy:  0.761443661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0023.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.5459915604442358
AUROC: 0.8414226057136536
AUPRC: 0.7275526326832116
Sensitivity: 0.7664884135472371
Specificity: 0.7588152327221439
Threshold: 0.14
Accuracy:  0.7609903991915109

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.47it/s]
Loss: 0.4982162594795227
AUROC: 0.8359463550147868
AUPRC: 0.6739285411986133
Sensitivity: 0.769334229993275
Specificity: 0.753398521345099
Threshold: 0.15
Accuracy:  0.7575704225352112

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0024.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.5476671457290649
AUROC: 0.8416715064022788
AUPRC: 0.7278085352348856
Sensitivity: 0.7611408199643493
Specificity: 0.765867418899859
Threshold: 0.14
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.4990981101989746
AUROC: 0.8370642395474188
AUPRC: 0.6737323070002622
Sensitivity: 0.7639542703429725
Specificity: 0.7598378249463391
Threshold: 0.15
Accuracy:  0.7609154929577465

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0025.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
Loss: 0.5215488933026791
AUROC: 0.8425124890320277
AUPRC: 0.7296122030500344
Sensitivity: 0.7557932263814616
Specificity: 0.7637517630465445
Threshold: 0.17
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.484256692065133
AUROC: 0.8373377764298296
AUPRC: 0.6738060745370453
Sensitivity: 0.7579018157363819
Specificity: 0.7593608394943954
Threshold: 0.18
Accuracy:  0.7589788732394366

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0026.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.503408920019865
AUROC: 0.8405759662500722
AUPRC: 0.729680544568208
Sensitivity: 0.7593582887700535
Specificity: 0.7489421720733427
Threshold: 0.18
Accuracy:  0.7518948964123294

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.4731508950392405
AUROC: 0.8342341632890891
AUPRC: 0.6724402366942792
Sensitivity: 0.7673167451244116
Specificity: 0.7512520868113522
Threshold: 0.19
Accuracy:  0.7554577464788732

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0027.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.5657776836305857
AUROC: 0.8428845829907807
AUPRC: 0.7286026681381852
Sensitivity: 0.768270944741533
Specificity: 0.7566995768688294
Threshold: 0.12
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.5202397194173601
AUROC: 0.8385139609664232
AUPRC: 0.6735170925570583
Sensitivity: 0.7646267652992602
Specificity: 0.756737419508705
Threshold: 0.13
Accuracy:  0.7588028169014085

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0028.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.4933346323668957
AUROC: 0.8402434701281463
AUPRC: 0.7285375223082073
Sensitivity: 0.7629233511586453
Specificity: 0.7510578279266573
Threshold: 0.2
Accuracy:  0.7544214249621021

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.46175946990648903
AUROC: 0.8343395363361391
AUPRC: 0.6719488966757814
Sensitivity: 0.7484868863483524
Specificity: 0.7658001430956356
Threshold: 0.22
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0029.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.5036901906132698
AUROC: 0.8407387573570266
AUPRC: 0.7259170465836962
Sensitivity: 0.7629233511586453
Specificity: 0.7609308885754584
Threshold: 0.17
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.46894330680370333
AUROC: 0.8359425057710588
AUPRC: 0.6747011485181389
Sensitivity: 0.7679892400806994
Specificity: 0.7505366086334366
Threshold: 0.18
Accuracy:  0.7551056338028169

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0030.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.5306737087666988
AUROC: 0.8417280747406027
AUPRC: 0.7271942815702852
Sensitivity: 0.7664884135472371
Specificity: 0.7574047954866009
Threshold: 0.15
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.4853099094496833
AUROC: 0.837455258556107
AUPRC: 0.6748275870115412
Sensitivity: 0.7659717552118359
Specificity: 0.7526830431671834
Threshold: 0.16
Accuracy:  0.7561619718309859

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0031.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.474030252546072
AUROC: 0.8411605057460863
AUPRC: 0.7292282575012339
Sensitivity: 0.750445632798574
Specificity: 0.765867418899859
Threshold: 0.21
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4526939779520035
AUROC: 0.8364074623363531
AUPRC: 0.6736816146309441
Sensitivity: 0.7552118359112306
Specificity: 0.7586453613164799
Threshold: 0.22
Accuracy:  0.7577464788732394

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0032.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.6126263812184334
AUROC: 0.8407777266567609
AUPRC: 0.7232256689968566
Sensitivity: 0.768270944741533
Specificity: 0.763046544428773
Threshold: 0.09
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.550629484653473
AUROC: 0.8373346489193007
AUPRC: 0.6726617181774107
Sensitivity: 0.7626092804303968
Specificity: 0.760076317672311
Threshold: 0.1
Accuracy:  0.7607394366197183

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0033.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.5490272715687752
AUROC: 0.8441089732469472
AUPRC: 0.7298544048789909
Sensitivity: 0.7700534759358288
Specificity: 0.7524682651622003
Threshold: 0.11
Accuracy:  0.7574532592218292

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.5091069797674815
AUROC: 0.8407265543767426
AUPRC: 0.6768059507077916
Sensitivity: 0.7726967047747142
Specificity: 0.7569759122346769
Threshold: 0.12
Accuracy:  0.7610915492957746

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0034.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.4801405258476734
AUROC: 0.8399618855107116
AUPRC: 0.7257222846575512
Sensitivity: 0.7540106951871658
Specificity: 0.765867418899859
Threshold: 0.18
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.44789157377349004
AUROC: 0.8363227789743402
AUPRC: 0.6747795525113907
Sensitivity: 0.7646267652992602
Specificity: 0.7564989267827331
Threshold: 0.19
Accuracy:  0.7586267605633803

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0035.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.4919229503720999
AUROC: 0.844122801062982
AUPRC: 0.729109039992084
Sensitivity: 0.7647058823529411
Specificity: 0.7637517630465445
Threshold: 0.18
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.47it/s]
Loss: 0.4544056839413113
AUROC: 0.8410895861758262
AUPRC: 0.6766441052953587
Sensitivity: 0.7700067249495629
Specificity: 0.7531600286191271
Threshold: 0.19
Accuracy:  0.7575704225352112

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0036.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
Loss: 0.5043614283204079
AUROC: 0.8408166959564952
AUPRC: 0.7261729942078546
Sensitivity: 0.7700534759358288
Specificity: 0.7538787023977433
Threshold: 0.16
Accuracy:  0.7584638706417383

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.4658162474632263
AUROC: 0.8375492442571288
AUPRC: 0.6752573383894885
Sensitivity: 0.7565568258238063
Specificity: 0.7626997376580015
Threshold: 0.18
Accuracy:  0.7610915492957746

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0037.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.49080137722194195
AUROC: 0.8413358675948903
AUPRC: 0.7277983221410083
Sensitivity: 0.7593582887700535
Specificity: 0.7651622002820875
Threshold: 0.2
Accuracy:  0.7635169277412834

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4590514792336358
AUROC: 0.8382182107399996
AUPRC: 0.6756473353828707
Sensitivity: 0.7518493611297915
Specificity: 0.7612687813021702
Threshold: 0.21
Accuracy:  0.7588028169014085

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0038.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.47844384983181953
AUROC: 0.8404307741817076
AUPRC: 0.7278973836635035
Sensitivity: 0.7522281639928698
Specificity: 0.7588152327221439
Threshold: 0.21
Accuracy:  0.7569479535118747

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.45256661309136287
AUROC: 0.8373247050396705
AUPRC: 0.673152002208163
Sensitivity: 0.7572293207800942
Specificity: 0.7579298831385642
Threshold: 0.22
Accuracy:  0.7577464788732394

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0039.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
Loss: 0.5584726110100746
AUROC: 0.8384332833017807
AUPRC: 0.7235848051430516
Sensitivity: 0.750445632798574
Specificity: 0.7693935119887165
Threshold: 0.12
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.43it/s]
Loss: 0.5118756314118703
AUROC: 0.8352457124637388
AUPRC: 0.6721864394266904
Sensitivity: 0.7558843308675185
Specificity: 0.7629382303839732
Threshold: 0.13
Accuracy:  0.7610915492957746

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0040.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.49241720139980316
AUROC: 0.8385451629042437
AUPRC: 0.7196621297477097
Sensitivity: 0.7540106951871658
Specificity: 0.7708039492242595
Threshold: 0.17
Accuracy:  0.7660434562910561

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.4575848665502336
AUROC: 0.8356974372537185
AUPRC: 0.6715548824519832
Sensitivity: 0.7612642905178211
Specificity: 0.7598378249463391
Threshold: 0.18
Accuracy:  0.7602112676056338

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0041.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
Loss: 0.500302717089653
AUROC: 0.8453798752479578
AUPRC: 0.7320842714504371
Sensitivity: 0.7593582887700535
Specificity: 0.765867418899859
Threshold: 0.17
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.42it/s]
Loss: 0.463620611694124
AUROC: 0.8430174157428616
AUPRC: 0.6768214369431189
Sensitivity: 0.7632817753866846
Specificity: 0.7629382303839732
Threshold: 0.18
Accuracy:  0.7630281690140845

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0042.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
Loss: 0.5096276607364416
AUROC: 0.8454427289572066
AUPRC: 0.7325905834778305
Sensitivity: 0.7611408199643493
Specificity: 0.7679830747531735
Threshold: 0.17
Accuracy:  0.7660434562910561

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.46958664920594956
AUROC: 0.8427453223268486
AUPRC: 0.6758650050697931
Sensitivity: 0.7639542703429725
Specificity: 0.7619842594800859
Threshold: 0.18
Accuracy:  0.7625

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0043.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.47115970216691494
AUROC: 0.8430241182253129
AUPRC: 0.7289283371470565
Sensitivity: 0.7629233511586453
Specificity: 0.7637517630465445
Threshold: 0.18
Accuracy:  0.7635169277412834

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.43it/s]
Loss: 0.4439871927102407
AUROC: 0.8407397861520569
AUPRC: 0.6787571179035261
Sensitivity: 0.7558843308675185
Specificity: 0.7693775339852135
Threshold: 0.2
Accuracy:  0.7658450704225352

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0044.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.4504497144371271
AUROC: 0.8413823793397344
AUPRC: 0.7278539537575502
Sensitivity: 0.7664884135472371
Specificity: 0.7616361071932299
Threshold: 0.22
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4309218082163069
AUROC: 0.8389861348637071
AUPRC: 0.6772752881568752
Sensitivity: 0.7639542703429725
Specificity: 0.7588838540424517
Threshold: 0.23
Accuracy:  0.7602112676056338

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0045.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.47530699893832207
AUROC: 0.847416335427619
AUPRC: 0.7336851670892623
Sensitivity: 0.7593582887700535
Specificity: 0.7729196050775741
Threshold: 0.2
Accuracy:  0.7690752905507833

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4462843610180749
AUROC: 0.8479866290103706
AUPRC: 0.6795174342381647
Sensitivity: 0.7565568258238063
Specificity: 0.7760553303124255
Threshold: 0.21
Accuracy:  0.7709507042253522

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0046.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.4825980067253113
AUROC: 0.8467940837060557
AUPRC: 0.7326559744293707
Sensitivity: 0.7575757575757576
Specificity: 0.767277856135402
Threshold: 0.17
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.45165329310629104
AUROC: 0.8453694640457381
AUPRC: 0.6821534655547332
Sensitivity: 0.7585743106926698
Specificity: 0.770093012163129
Threshold: 0.18
Accuracy:  0.7670774647887324

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0047.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.535110104829073
AUROC: 0.8406972739089226
AUPRC: 0.725188353502842
Sensitivity: 0.7522281639928698
Specificity: 0.7708039492242595
Threshold: 0.13
Accuracy:  0.7655381505811015

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4911608298619588
AUROC: 0.8386687326413141
AUPRC: 0.6758860411471659
Sensitivity: 0.7612642905178211
Specificity: 0.7622227522060577
Threshold: 0.14
Accuracy:  0.7619718309859155

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0048.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.5002718456089497
AUROC: 0.8408745213690041
AUPRC: 0.726347743677447
Sensitivity: 0.7647058823529411
Specificity: 0.7566995768688294
Threshold: 0.16
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.4610769093036652
AUROC: 0.8391251887933759
AUPRC: 0.6774532729590755
Sensitivity: 0.7525218560860794
Specificity: 0.768423563081326
Threshold: 0.18
Accuracy:  0.7642605633802817

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0049.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.42539431527256966
AUROC: 0.8456136910463635
AUPRC: 0.7257671256028883
Sensitivity: 0.768270944741533
Specificity: 0.7623413258110014
Threshold: 0.26
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.41237973537709977
AUROC: 0.8446061108989573
AUPRC: 0.6819439025136731
Sensitivity: 0.7659717552118359
Specificity: 0.7705699976150727
Threshold: 0.28
Accuracy:  0.7693661971830986

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0050.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
Loss: 0.47782506607472897
AUROC: 0.8469097345310735
AUPRC: 0.7309371522268588
Sensitivity: 0.7718360071301248
Specificity: 0.7609308885754584
Threshold: 0.21
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.44386572738488517
AUROC: 0.8454842998169525
AUPRC: 0.680916366398144
Sensitivity: 0.7673167451244116
Specificity: 0.7631767231099451
Threshold: 0.22
Accuracy:  0.7642605633802817

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0051.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.5042339731007814
AUROC: 0.8465904376880897
AUPRC: 0.732610333485597
Sensitivity: 0.768270944741533
Specificity: 0.763046544428773
Threshold: 0.19
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4670453088151084
AUROC: 0.8441660140327387
AUPRC: 0.6785268649860522
Sensitivity: 0.7673167451244116
Specificity: 0.7626997376580015
Threshold: 0.2
Accuracy:  0.7639084507042253

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0052.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.4836157038807869
AUROC: 0.8413773510429945
AUPRC: 0.7243458757602256
Sensitivity: 0.7754010695187166
Specificity: 0.7510578279266573
Threshold: 0.21
Accuracy:  0.7579585649317837

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.4513730376958847
AUROC: 0.8446416362108622
AUPRC: 0.6804107544556933
Sensitivity: 0.7491593813046402
Specificity: 0.7844025757214405
Threshold: 0.22
Accuracy:  0.7751760563380282

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0053.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
Loss: 0.43293687142431736
AUROC: 0.8421429092216448
AUPRC: 0.725648378722508
Sensitivity: 0.7664884135472371
Specificity: 0.7545839210155149
Threshold: 0.25
Accuracy:  0.7579585649317837

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.4155806481838226
AUROC: 0.841627838757105
AUPRC: 0.6809105192421268
Sensitivity: 0.7713517148621385
Specificity: 0.7612687813021702
Threshold: 0.26
Accuracy:  0.7639084507042253

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0054.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.43948535434901714
AUROC: 0.8410970234997448
AUPRC: 0.7277147073144837
Sensitivity: 0.7754010695187166
Specificity: 0.7545839210155149
Threshold: 0.25
Accuracy:  0.7604850934815564

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.42186329894595676
AUROC: 0.8390283161595583
AUPRC: 0.677821250537787
Sensitivity: 0.7652992602555481
Specificity: 0.7562604340567612
Threshold: 0.26
Accuracy:  0.7586267605633803

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0055.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.4514637775719166
AUROC: 0.8413346105207054
AUPRC: 0.7270777039352052
Sensitivity: 0.7647058823529411
Specificity: 0.7524682651622003
Threshold: 0.21
Accuracy:  0.7559373420919656

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4291424870491028
AUROC: 0.8398587904938436
AUPRC: 0.6770826032399426
Sensitivity: 0.7726967047747142
Specificity: 0.7512520868113522
Threshold: 0.22
Accuracy:  0.7568661971830986

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0056.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.4555010423064232
AUROC: 0.8400737651131743
AUPRC: 0.7234992701629736
Sensitivity: 0.7629233511586453
Specificity: 0.7588152327221439
Threshold: 0.22
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.433478468325403
AUROC: 0.8376005675068336
AUPRC: 0.6759252957247742
Sensitivity: 0.7605917955615333
Specificity: 0.7548294777009301
Threshold: 0.23
Accuracy:  0.7563380281690141

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0057.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.45781283266842365
AUROC: 0.8408016110662755
AUPRC: 0.7268142729585125
Sensitivity: 0.7593582887700535
Specificity: 0.770098730606488
Threshold: 0.25
Accuracy:  0.7670540677109652

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.435899195406172
AUROC: 0.8398623189672607
AUPRC: 0.679239888300966
Sensitivity: 0.7740416946872899
Specificity: 0.7448127832101121
Threshold: 0.25
Accuracy:  0.7524647887323944

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0058.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
Loss: 0.46584317460656166
AUROC: 0.8403075809115801
AUPRC: 0.7260878078389019
Sensitivity: 0.7700534759358288
Specificity: 0.7531734837799718
Threshold: 0.24
Accuracy:  0.7579585649317837

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.4403448932700687
AUROC: 0.8449413960661691
AUPRC: 0.6821331836241291
Sensitivity: 0.7545393409549428
Specificity: 0.7817791557357501
Threshold: 0.25
Accuracy:  0.7746478873239436

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0059.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.46019774302840233
AUROC: 0.8415231716484516
AUPRC: 0.7282795891179853
Sensitivity: 0.7611408199643493
Specificity: 0.7616361071932299
Threshold: 0.2
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.43219871189859177
AUROC: 0.840449328635759
AUPRC: 0.6793217721295756
Sensitivity: 0.7659717552118359
Specificity: 0.7536370140710709
Threshold: 0.21
Accuracy:  0.7568661971830986

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0060.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.43688652105629444
AUROC: 0.8439845229026345
AUPRC: 0.7280856658285255
Sensitivity: 0.7754010695187166
Specificity: 0.7574047954866009
Threshold: 0.28
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.425673246383667
AUROC: 0.843188947666484
AUPRC: 0.6816579341681198
Sensitivity: 0.7747141896435776
Specificity: 0.7605533031242547
Threshold: 0.29
Accuracy:  0.7642605633802817

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0061.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.4874818027019501
AUROC: 0.840511226929546
AUPRC: 0.7258977661757795
Sensitivity: 0.7575757575757576
Specificity: 0.7574047954866009
Threshold: 0.18
Accuracy:  0.7574532592218292

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.45121292736795215
AUROC: 0.8366406623521991
AUPRC: 0.6763771538638285
Sensitivity: 0.7505043712172159
Specificity: 0.7631767231099451
Threshold: 0.2
Accuracy:  0.7598591549295775

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0062.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
Loss: 0.4779135696589947
AUROC: 0.8408933774817788
AUPRC: 0.7260668974233124
Sensitivity: 0.7647058823529411
Specificity: 0.7538787023977433
Threshold: 0.19
Accuracy:  0.7569479535118747

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.44351594381862214
AUROC: 0.8408974447597438
AUPRC: 0.6787999480783344
Sensitivity: 0.7659717552118359
Specificity: 0.7607917958502266
Threshold: 0.21
Accuracy:  0.7621478873239437

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0063.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
Loss: 0.45059993490576744
AUROC: 0.8444295271641161
AUPRC: 0.729152089411826
Sensitivity: 0.7771836007130125
Specificity: 0.7503526093088858
Threshold: 0.25
Accuracy:  0.7579585649317837

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.42867298987176683
AUROC: 0.8428132454401298
AUPRC: 0.6810117971102697
Sensitivity: 0.7579018157363819
Specificity: 0.7746243739565943
Threshold: 0.26
Accuracy:  0.7702464788732394

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0064.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.456649849191308
AUROC: 0.8419015509781295
AUPRC: 0.7272686295316054
Sensitivity: 0.7664884135472371
Specificity: 0.7552891396332864
Threshold: 0.23
Accuracy:  0.7584638706417383

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.43046312431494393
AUROC: 0.8413450797282626
AUPRC: 0.6804302620127854
Sensitivity: 0.7585743106926698
Specificity: 0.7715239685189602
Threshold: 0.24
Accuracy:  0.7681338028169014

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0065.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.44218817725777626
AUROC: 0.8428418424684915
AUPRC: 0.7247750051892374
Sensitivity: 0.7557932263814616
Specificity: 0.771509167842031
Threshold: 0.23
Accuracy:  0.7670540677109652

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.42016775641176435
AUROC: 0.8418923940708175
AUPRC: 0.6801030092862637
Sensitivity: 0.7767316745124412
Specificity: 0.7491056522776055
Threshold: 0.23
Accuracy:  0.7563380281690141

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0066.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.44203284941613674
AUROC: 0.8428028731687571
AUPRC: 0.7264550639538542
Sensitivity: 0.7557932263814616
Specificity: 0.7870239774330042
Threshold: 0.26
Accuracy:  0.7781707933299646

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.43it/s]
Loss: 0.42430533866087594
AUROC: 0.8416241498985323
AUPRC: 0.6797293474995962
Sensitivity: 0.773369199731002
Specificity: 0.7572144049606487
Threshold: 0.26
Accuracy:  0.761443661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0067.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
Loss: 0.4565674029290676
AUROC: 0.8438537871873971
AUPRC: 0.7267857449495738
Sensitivity: 0.7700534759358288
Specificity: 0.7531734837799718
Threshold: 0.23
Accuracy:  0.7579585649317837

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.43117510345247057
AUROC: 0.8443181393525668
AUPRC: 0.6818918471223266
Sensitivity: 0.7673167451244116
Specificity: 0.7557834486048175
Threshold: 0.24
Accuracy:  0.7588028169014085

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0068.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.4493854846805334
AUROC: 0.8422887298271021
AUPRC: 0.7280637193202748
Sensitivity: 0.7700534759358288
Specificity: 0.7545839210155149
Threshold: 0.23
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.42723261813322705
AUROC: 0.8400912687764907
AUPRC: 0.6790867640086877
Sensitivity: 0.7518493611297915
Specificity: 0.7710469830670165
Threshold: 0.25
Accuracy:  0.7660211267605633

Intermediate Model:
  ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0069.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
Loss: 0.4508330076932907
AUROC: 0.8431762242016951
AUPRC: 0.728551469157394
Sensitivity: 0.7754010695187166
Specificity: 0.7461212976022567
Threshold: 0.26
Accuracy:  0.7544214249621021

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.4306994656721751
AUROC: 0.8424508551816676
AUPRC: 0.6813727160694755
Sensitivity: 0.7605917955615333
Specificity: 0.7729549248747913
Threshold: 0.27
Accuracy:  0.7697183098591549


Plot AUROC/AUPRC for Each Intermediate Model
  Epoch with best Validation Loss:      49, 0.4216
  Epoch with best model Test AUROC:     45, 0.848
  Epoch with best model Test Accuracy:   0, 0.7782
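The selection step summarized above keeps the checkpoint from the epoch with the lowest validation loss. A minimal sketch of that logic, with illustrative values and a hypothetical path pattern (not the project's actual variables or checkpoint names):

```python
# Hypothetical epoch -> validation-loss map; the real values come from the
# per-epoch evaluation loop logged above.
val_losses = {48: 0.491, 49: 0.425, 50: 0.478}

# Choose the epoch whose checkpoint minimizes validation loss.
best_epoch = min(val_losses, key=val_losses.get)

# Illustrative checkpoint path pattern (the project encodes hyperparameters
# and a run hash in the real filename).
best_path = f"./vitaldb_cache/models/EXPERIMENT_{best_epoch:04d}.model"
```

Test AUROC could be used instead, but selecting on validation loss avoids tuning against the test split.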

AUROC/AUPRC Plots - Best Model Based on Validation Loss
  Epoch with best Validation Loss:   49, 0.4216
  Best Model Based on Validation Loss:
    ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0049.model

Generate Stats Based on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.41237973537709977
AUROC: 0.8446061108989573
AUPRC: 0.6819439025136731
Sensitivity: 0.7659717552118359
Specificity: 0.7705699976150727
Threshold: 0.28
Accuracy:  0.7693661971830986
best_model_val_test_auroc: 0.8446061108989573
best_model_val_test_auprc: 0.6819439025136731

AUROC/AUPRC Plots - Best Model Based on Model AUROC
  Epoch with best model Test AUROC:  45, 0.848
  Best Model Based on Model AUROC:
    ./vitaldb_cache/models/ABP_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_e5f00beb_0045.model

Generate Stats Based on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.4462843610180749
AUROC: 0.8479866290103706
AUPRC: 0.6795174342381647
Sensitivity: 0.7565568258238063
Specificity: 0.7760553303124255
Threshold: 0.21
Accuracy:  0.7709507042253522
best_model_auroc_test_auroc: 0.8479866290103706
best_model_auroc_test_auprc: 0.6795174342381647

Total Processing Time: 6226.9050 sec
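The per-model statistics above (AUROC, AUPRC, sensitivity, specificity at a chosen threshold) can be reproduced from labels and predicted probabilities with scikit-learn. This is a minimal sketch, not the project's evaluation code; `y_true` and `y_score` are hypothetical stand-ins, and the threshold here is picked by maximizing Youden's J (sensitivity + specificity − 1), which is one common choice:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score, roc_curve

# Hypothetical test labels and model output probabilities.
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])
y_score = np.array([0.1, 0.3, 0.6, 0.8, 0.2, 0.4, 0.5, 0.9])

auroc = roc_auc_score(y_true, y_score)            # area under the ROC curve
auprc = average_precision_score(y_true, y_score)  # area under the PR curve

# Threshold maximizing Youden's J = TPR - FPR.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
threshold = thresholds[np.argmax(tpr - fpr)]

# Sensitivity/specificity of the thresholded predictions.
preds = (y_score >= threshold).astype(int)
sensitivity = (preds[y_true == 1] == 1).mean()
specificity = (preds[y_true == 0] == 0).mean()
```

With this toy data the selected threshold is 0.6; in the logs above the threshold varies per checkpoint because it is re-derived from each model's score distribution.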
In [109]:
RUN_ME = True                               # set False to skip this experiment
DISPLAY_MODEL_PREDICTION = True             # plot predictions for cases of interest
DISPLAY_MODEL_PREDICTION_FIRST_ONLY = True  # only plot the first such case

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=True, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=1e-1,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
Experiment Setup
  name:              ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20
  prediction_window: 003
  max_cases:         _ALL
  use_abp:           True
  use_eeg:           True
  use_ecg:           False
  n_residuals:       12
  skip_connection:   False
  batch_size:        128
  learning_rate:     0.0001
  weight_decay:      0.1
  balance_labels:    False
  max_epochs:        200
  patience:          20
  device:            mps

Model Architecture
HypotensionCNN(
  (abpResiduals): Sequential(
    (0): ResidualBlock(
      (bn1): BatchNorm1d(1, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (1): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (2): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (3): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (4): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (5): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (6): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (7): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (8): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=1, dilation=1, ceil_mode=False)
    )
    (9): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (10): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (11): ResidualBlock(
      (bn1): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
  )
  (abpFc): Linear(in_features=2814, out_features=32, bias=True)
  (eegResiduals): Sequential(
    (0): ResidualBlock(
      (bn1): BatchNorm1d(1, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(1, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(1, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (1): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (2): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (3): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (4): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (5): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (6): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (7): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
    )
    (8): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (9): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
    )
    (10): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (11): ResidualBlock(
      (bn1): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(6, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(6, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
    )
  )
  (eegFc): Linear(in_features=720, out_features=32, bias=True)
  (fullLinear1): Linear(in_features=64, out_features=16, bias=True)
  (fullLinear2): Linear(in_features=16, out_features=1, bias=True)
  (sigmoid): Sigmoid()
)
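The repeated `ResidualBlock` entries in the repr above all share the same sub-module list (`bn1 → relu → dropout → conv1 → bn2 → conv2`, a `residualConv` shortcut, and an optional `MaxPool1d` downsample). A minimal sketch of such a block is below; the constructor arguments and the exact forward-pass wiring are assumptions inferred from the printed module list, not the project's actual source.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Pre-activation 1-D residual block mirroring the printed repr:
    bn1 -> relu -> dropout -> conv1 -> bn2 -> relu -> dropout -> conv2,
    with a full-width conv shortcut (residualConv) and an optional
    MaxPool1d(2) downsample. The forward wiring is an assumption."""

    def __init__(self, in_ch, out_ch, kernel_size=7, p_drop=0.5, downsample=False):
        super().__init__()
        pad = kernel_size // 2  # "same" padding, matching e.g. kernel 7 / pad 3 above
        self.bn1 = nn.BatchNorm1d(in_ch)
        self.relu = nn.ReLU()
        self.dropout = nn.Dropout(p=p_drop)
        self.conv1 = nn.Conv1d(in_ch, out_ch, kernel_size, padding=pad, bias=False)
        self.bn2 = nn.BatchNorm1d(out_ch)
        self.conv2 = nn.Conv1d(out_ch, out_ch, kernel_size, padding=pad, bias=False)
        # Shortcut conv lets the residual path change channel count (e.g. 2 -> 4).
        self.residualConv = nn.Conv1d(in_ch, out_ch, kernel_size, padding=pad, bias=False)
        self.downsample = nn.MaxPool1d(kernel_size=2, stride=2) if downsample else None

    def forward(self, x):
        shortcut = self.residualConv(x)
        out = self.conv1(self.dropout(self.relu(self.bn1(x))))
        out = self.conv2(self.dropout(self.relu(self.bn2(out))))
        out = out + shortcut
        if self.downsample is not None:  # halves the temporal length when present
            out = self.downsample(out)
        return out
```

With `downsample=True`, an input of shape `(batch, in_ch, L)` comes out as `(batch, out_ch, L // 2)`, which is consistent with the alternating pooled/unpooled blocks in the repr.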

Training Loop
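The log that follows shows a checkpoint-on-best early-stopping pattern: the model is saved whenever validation loss reaches a new minimum, and a counter of epochs without improvement accumulates otherwise. A minimal, framework-agnostic sketch of that control flow is below; the function names, the patience threshold, and the epoch-as-placeholder checkpoint are all hypothetical, standing in for the notebook's actual optimizer step and `state_dict` save.

```python
def train_with_early_stopping(train_step, validate, num_epochs=100, patience=15):
    """Early-stopping harness: `train_step` and `validate` are hypothetical
    callables that run one epoch and return its mean loss. Saves a checkpoint
    placeholder on each new best validation loss; stops after `patience`
    consecutive epochs without improvement."""
    best_val = float("inf")
    best_checkpoint = None  # real code would deep-copy model.state_dict() here
    epochs_since_improvement = 0

    for epoch in range(num_epochs):
        train_loss = train_step()
        val_loss = validate()
        print(f"Completed epoch {epoch} with training loss {train_loss:.8f}, "
              f"validation loss {val_loss:.8f}")

        if val_loss < best_val:
            best_val = val_loss
            best_checkpoint = epoch  # placeholder for the saved model state
            epochs_since_improvement = 0
            print(f"Validation loss improved to {best_val:.8f}. Model saved.")
        else:
            epochs_since_improvement += 1
            print(f"No improvement in validation loss. "
                  f"{epochs_since_improvement} epochs without improvement.")
            if epochs_since_improvement >= patience:
                break

    return best_val, best_checkpoint
```

Note that only the best checkpoint is kept, so a long stretch of non-improving epochs (as in the log below) costs training time but never degrades the saved model.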
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:51<00:00,  1.78it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.43it/s]
[2024-05-05 07:18:26.074808] Completed epoch 0 with training loss 0.52067941, validation loss 0.58197838
Validation loss improved to 0.58197838. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:47<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:19:20.584672] Completed epoch 1 with training loss 0.45249766, validation loss 0.62700230
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.90it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:20:15.376850] Completed epoch 2 with training loss 0.44580925, validation loss 0.61036634
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 07:21:09.869460] Completed epoch 3 with training loss 0.43858019, validation loss 0.62134653
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
[2024-05-05 07:22:04.526794] Completed epoch 4 with training loss 0.43997502, validation loss 0.63151473
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:22:59.143836] Completed epoch 5 with training loss 0.44057211, validation loss 0.58018810
Validation loss improved to 0.58018810. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:23:53.800833] Completed epoch 6 with training loss 0.44102722, validation loss 0.61198688
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:24:48.387242] Completed epoch 7 with training loss 0.43785834, validation loss 0.56178164
Validation loss improved to 0.56178164. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
[2024-05-05 07:25:43.165000] Completed epoch 8 with training loss 0.43513700, validation loss 0.60046405
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:26:37.641630] Completed epoch 9 with training loss 0.44184667, validation loss 0.56145406
Validation loss improved to 0.56145406. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.90it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:27:32.655482] Completed epoch 10 with training loss 0.44001263, validation loss 0.53462195
Validation loss improved to 0.53462195. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:28:27.306060] Completed epoch 11 with training loss 0.44025588, validation loss 0.56097221
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:47<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:07<00:00,  2.27it/s]
[2024-05-05 07:29:22.328042] Completed epoch 12 with training loss 0.43574342, validation loss 0.56533360
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:47<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.54it/s]
[2024-05-05 07:30:16.515322] Completed epoch 13 with training loss 0.43772271, validation loss 0.54829186
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:47<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
[2024-05-05 07:31:10.898397] Completed epoch 14 with training loss 0.43581137, validation loss 0.56125402
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:49<00:00,  1.88it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 07:32:06.297666] Completed epoch 15 with training loss 0.43844613, validation loss 0.52834481
Validation loss improved to 0.52834481. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:47<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:33:00.622133] Completed epoch 16 with training loss 0.43602604, validation loss 0.56905550
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:33:55.188665] Completed epoch 17 with training loss 0.43746528, validation loss 0.55188495
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 07:34:49.663213] Completed epoch 18 with training loss 0.43065047, validation loss 0.55638468
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
[2024-05-05 07:35:44.155554] Completed epoch 19 with training loss 0.43275455, validation loss 0.55607522
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:36:38.729618] Completed epoch 20 with training loss 0.43141559, validation loss 0.57124084
No improvement in validation loss. 5 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:37:33.162052] Completed epoch 21 with training loss 0.42947176, validation loss 0.51110560
Validation loss improved to 0.51110560. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
[2024-05-05 07:38:27.912243] Completed epoch 22 with training loss 0.43250167, validation loss 0.50454634
Validation loss improved to 0.50454634. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 07:39:22.512324] Completed epoch 23 with training loss 0.43079633, validation loss 0.57776016
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:40:17.112720] Completed epoch 24 with training loss 0.42970553, validation loss 0.50227660
Validation loss improved to 0.50227660. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:41:11.668609] Completed epoch 25 with training loss 0.43029836, validation loss 0.49041334
Validation loss improved to 0.49041334. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:42:06.311375] Completed epoch 26 with training loss 0.43061164, validation loss 0.50198627
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.89it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:43:01.536221] Completed epoch 27 with training loss 0.42937046, validation loss 0.58962500
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:43:56.111143] Completed epoch 28 with training loss 0.42915162, validation loss 0.48009935
Validation loss improved to 0.48009935. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:44:50.685048] Completed epoch 29 with training loss 0.42878437, validation loss 0.55984181
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 07:45:45.422753] Completed epoch 30 with training loss 0.43071303, validation loss 0.59059048
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.89it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:46:40.585205] Completed epoch 31 with training loss 0.43128034, validation loss 0.50862813
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:47:35.085703] Completed epoch 32 with training loss 0.42986518, validation loss 0.50976050
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
[2024-05-05 07:48:29.787581] Completed epoch 33 with training loss 0.43042967, validation loss 0.50224233
No improvement in validation loss. 5 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 07:49:24.228991] Completed epoch 34 with training loss 0.42769516, validation loss 0.47104454
Validation loss improved to 0.47104454. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:50:18.811653] Completed epoch 35 with training loss 0.42622575, validation loss 0.46876019
Validation loss improved to 0.46876019. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:51:13.428679] Completed epoch 36 with training loss 0.42733043, validation loss 0.48842484
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.89it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 07:52:08.555178] Completed epoch 37 with training loss 0.42725375, validation loss 0.54367763
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 07:53:03.098892] Completed epoch 38 with training loss 0.42642772, validation loss 0.49204049
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:53:57.735411] Completed epoch 39 with training loss 0.42876050, validation loss 0.46401832
Validation loss improved to 0.46401832. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.89it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:54:52.945287] Completed epoch 40 with training loss 0.43121243, validation loss 0.52334237
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.49it/s]
[2024-05-05 07:55:47.526206] Completed epoch 41 with training loss 0.42564791, validation loss 0.52337080
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:56:42.078819] Completed epoch 42 with training loss 0.42743945, validation loss 0.46926281
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.53it/s]
[2024-05-05 07:57:36.561763] Completed epoch 43 with training loss 0.42538592, validation loss 0.51478481
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
[2024-05-05 07:58:31.153060] Completed epoch 44 with training loss 0.42853490, validation loss 0.50131547
No improvement in validation loss. 5 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 07:59:25.754183] Completed epoch 45 with training loss 0.42647704, validation loss 0.45071602
Validation loss improved to 0.45071602. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 08:00:20.372040] Completed epoch 46 with training loss 0.42487931, validation loss 0.44095129
Validation loss improved to 0.44095129. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 08:01:15.013810] Completed epoch 47 with training loss 0.42730042, validation loss 0.47809589
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 08:02:09.672576] Completed epoch 48 with training loss 0.42708537, validation loss 0.49697754
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:49<00:00,  1.87it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 08:03:05.371202] Completed epoch 49 with training loss 0.42649660, validation loss 0.42645496
Validation loss improved to 0.42645496. Model saved.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.89it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 08:04:00.571696] Completed epoch 50 with training loss 0.42463046, validation loss 0.59366918
No improvement in validation loss. 1 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 08:04:55.238170] Completed epoch 51 with training loss 0.42609233, validation loss 0.53760529
No improvement in validation loss. 2 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 08:05:49.921236] Completed epoch 52 with training loss 0.42606398, validation loss 0.49715802
No improvement in validation loss. 3 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 08:06:44.371871] Completed epoch 53 with training loss 0.42312986, validation loss 0.45242694
No improvement in validation loss. 4 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.92it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
[2024-05-05 08:07:38.845509] Completed epoch 54 with training loss 0.42630118, validation loss 0.46196172
No improvement in validation loss. 5 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
[2024-05-05 08:08:33.461472] Completed epoch 55 with training loss 0.42710286, validation loss 0.44791645
No improvement in validation loss. 6 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.91it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 08:09:27.939485] Completed epoch 56 with training loss 0.42426801, validation loss 0.45083228
No improvement in validation loss. 7 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:48<00:00,  1.89it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
[2024-05-05 08:10:23.138044] Completed epoch 57 with training loss 0.42954317, validation loss 0.50412488
No improvement in validation loss. 8 epochs without improvement.
[2024-05-05 08:11:17.963241] Completed epoch 58 with training loss 0.42685416, validation loss 0.53713495
No improvement in validation loss. 9 epochs without improvement.
[2024-05-05 08:12:12.262628] Completed epoch 59 with training loss 0.42507836, validation loss 0.46556437
No improvement in validation loss. 10 epochs without improvement.
[2024-05-05 08:13:06.531933] Completed epoch 60 with training loss 0.42377430, validation loss 0.43272328
No improvement in validation loss. 11 epochs without improvement.
[2024-05-05 08:14:00.832559] Completed epoch 61 with training loss 0.42520976, validation loss 0.61897093
No improvement in validation loss. 12 epochs without improvement.
[2024-05-05 08:14:55.016557] Completed epoch 62 with training loss 0.42685601, validation loss 0.42711484
No improvement in validation loss. 13 epochs without improvement.
[2024-05-05 08:15:49.454800] Completed epoch 63 with training loss 0.42286682, validation loss 0.55163848
No improvement in validation loss. 14 epochs without improvement.
[2024-05-05 08:16:43.767960] Completed epoch 64 with training loss 0.42338845, validation loss 0.47268242
No improvement in validation loss. 15 epochs without improvement.
[2024-05-05 08:17:38.124889] Completed epoch 65 with training loss 0.42761073, validation loss 0.49037874
No improvement in validation loss. 16 epochs without improvement.
[2024-05-05 08:18:32.435267] Completed epoch 66 with training loss 0.42681313, validation loss 0.55548954
No improvement in validation loss. 17 epochs without improvement.
[2024-05-05 08:19:26.577079] Completed epoch 67 with training loss 0.42711949, validation loss 0.54232180
No improvement in validation loss. 18 epochs without improvement.
[2024-05-05 08:20:20.908664] Completed epoch 68 with training loss 0.42454234, validation loss 0.54427397
No improvement in validation loss. 19 epochs without improvement.
[2024-05-05 08:21:15.075974] Completed epoch 69 with training loss 0.42339718, validation loss 0.52849489
No improvement in validation loss. 20 epochs without improvement.
Early stopping due to no improvement in validation loss.
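The log above reflects early stopping with a patience of 20 epochs: training halts once 20 consecutive epochs pass without a new best validation loss. A minimal sketch of that bookkeeping (class and method names are hypothetical, not taken from the project code):

```python
class EarlyStopping:
    """Stop training after `patience` epochs without validation-loss improvement."""

    def __init__(self, patience=20):
        self.patience = patience
        self.best_loss = float("inf")
        self.epochs_without_improvement = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best_loss:
            self.best_loss = val_loss
            self.epochs_without_improvement = 0
        else:
            self.epochs_without_improvement += 1
        return self.epochs_without_improvement >= self.patience
```

With patience 20 and the best validation loss at epoch 49 (as reported below), this rule triggers exactly after epoch 69, matching the log.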

Plot Training and Validation Loss Values
  Epoch with best Validation Loss:   49, 0.4265
Generate AUROC/AUPRC for Each Intermediate Model

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0000.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:05<00:00,  2.95it/s]
Loss: 0.5808164440095425
AUROC: 0.8378952555506111
AUPRC: 0.7062493611642052
Sensitivity: 0.7397504456327986
Specificity: 0.7884344146685472
Threshold: 0.16
Accuracy:  0.774633653360283

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.44it/s]
Loss: 0.5410445041126675
AUROC: 0.8319739354876374
AUPRC: 0.6643890084959048
Sensitivity: 0.7491593813046402
Specificity: 0.7741473885046506
Threshold: 0.16
Accuracy:  0.7676056338028169
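Each checkpoint above is scored with AUROC, AUPRC, and sensitivity/specificity/accuracy at a single operating threshold. The evaluation could be sketched with scikit-learn as below; the threshold-selection rule is an assumption here (maximizing Youden's J, i.e. sensitivity + specificity − 1), since the log does not show how the reported thresholds were chosen:

```python
import numpy as np
from sklearn.metrics import average_precision_score, roc_auc_score, roc_curve


def evaluate(y_true, y_prob):
    """Compute threshold-free (AUROC, AUPRC) and thresholded metrics.

    The operating threshold is picked by Youden's J statistic; the
    notebook's actual selection rule may differ.
    """
    auroc = roc_auc_score(y_true, y_prob)
    auprc = average_precision_score(y_true, y_prob)

    fpr, tpr, thresholds = roc_curve(y_true, y_prob)
    threshold = thresholds[np.argmax(tpr - fpr)]  # maximize Youden's J

    y_pred = (y_prob >= threshold).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(y_true)
    return auroc, auprc, sensitivity, specificity, threshold, accuracy
```

Note that AUROC and AUPRC are threshold-free, while sensitivity, specificity, and accuracy all depend on the chosen threshold, which is why the same checkpoint can report a different threshold on validation and test data.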

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0001.model
AUROC/AUPRC on Validation Data
Loss: 0.6286161690950394
AUROC: 0.8415219145742667
AUPRC: 0.72137895974696
Sensitivity: 0.7736185383244206
Specificity: 0.7531734837799718
Threshold: 0.11
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
Loss: 0.5806572642591264
AUROC: 0.837064399932574
AUPRC: 0.6786156776962226
Sensitivity: 0.7841291190316073
Specificity: 0.7393274505127594
Threshold: 0.11
Accuracy:  0.7510563380281691

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0002.model
AUROC/AUPRC on Validation Data
Loss: 0.6103182770311832
AUROC: 0.8425250597738775
AUPRC: 0.7254415979018194
Sensitivity: 0.7522281639928698
Specificity: 0.7764456981664316
Threshold: 0.12
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
Loss: 0.5680824604299334
AUROC: 0.8382485235343562
AUPRC: 0.6801478119958979
Sensitivity: 0.7726967047747142
Specificity: 0.760076317672311
Threshold: 0.12
Accuracy:  0.7633802816901408

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0003.model
AUROC/AUPRC on Validation Data
Loss: 0.6218882948160172
AUROC: 0.8422434751564429
AUPRC: 0.7265173684081727
Sensitivity: 0.7950089126559715
Specificity: 0.7334273624823695
Threshold: 0.11
Accuracy:  0.7508842849924204

AUROC/AUPRC on Test Data
Loss: 0.5717428247133891
AUROC: 0.8372672871540634
AUPRC: 0.6798310712658804
Sensitivity: 0.7505043712172159
Specificity: 0.7779632721202003
Threshold: 0.12
Accuracy:  0.770774647887324

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0004.model
AUROC/AUPRC on Validation Data
Loss: 0.6282901000231504
AUROC: 0.8426658520825947
AUPRC: 0.7272887981794481
Sensitivity: 0.750445632798574
Specificity: 0.7827926657263752
Threshold: 0.11
Accuracy:  0.7736230419403739

AUROC/AUPRC on Test Data
Loss: 0.5766062551074558
AUROC: 0.8376711369751777
AUPRC: 0.679876894021313
Sensitivity: 0.7700067249495629
Specificity: 0.7638922012878607
Threshold: 0.11
Accuracy:  0.7654929577464789

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0005.model
AUROC/AUPRC on Validation Data
Loss: 0.5848910771310329
AUROC: 0.8428518990619712
AUPRC: 0.7253485315780639
Sensitivity: 0.7540106951871658
Specificity: 0.7820874471086037
Threshold: 0.14
Accuracy:  0.7741283476503285

AUROC/AUPRC on Test Data
Loss: 0.5419933577378591
AUROC: 0.8379441927021225
AUPRC: 0.67977660418192
Sensitivity: 0.7706792199058508
Specificity: 0.7622227522060577
Threshold: 0.14
Accuracy:  0.7644366197183099

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0006.model
AUROC/AUPRC on Validation Data
Loss: 0.6110657602548599
AUROC: 0.8421089682186504
AUPRC: 0.7251320835869455
Sensitivity: 0.7843137254901961
Specificity: 0.7397743300423131
Threshold: 0.11
Accuracy:  0.752400202122284

AUROC/AUPRC on Test Data
Loss: 0.5639707644780477
AUROC: 0.8368122744683993
AUPRC: 0.6792161408645484
Sensitivity: 0.7545393409549428
Specificity: 0.7782017648461722
Threshold: 0.12
Accuracy:  0.7720070422535211

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0007.model
AUROC/AUPRC on Validation Data
Loss: 0.5676175430417061
AUROC: 0.8424156943197847
AUPRC: 0.7233965881264849
Sensitivity: 0.7754010695187166
Specificity: 0.7433004231311706
Threshold: 0.15
Accuracy:  0.752400202122284

AUROC/AUPRC on Test Data
Loss: 0.5234590699275334
AUROC: 0.8372777121891595
AUPRC: 0.6792979412786524
Sensitivity: 0.7579018157363819
Specificity: 0.7734319103267351
Threshold: 0.16
Accuracy:  0.7693661971830986

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0008.model
AUROC/AUPRC on Validation Data
Loss: 0.5948936995118856
AUROC: 0.841192561137803
AUPRC: 0.7227709877143108
Sensitivity: 0.7629233511586453
Specificity: 0.7750352609308886
Threshold: 0.12
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
Loss: 0.5524462699890137
AUROC: 0.8360116317730051
AUPRC: 0.6785588618345426
Sensitivity: 0.7807666442501682
Specificity: 0.7417123777724779
Threshold: 0.12
Accuracy:  0.7519366197183098

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0009.model
AUROC/AUPRC on Validation Data
Loss: 0.5657588392496109
AUROC: 0.8414263769362086
AUPRC: 0.7226871444730231
Sensitivity: 0.7789661319073083
Specificity: 0.7411847672778561
Threshold: 0.14
Accuracy:  0.7518948964123294

AUROC/AUPRC on Test Data
Loss: 0.5223377254274156
AUROC: 0.8360406614861191
AUPRC: 0.6780030780128424
Sensitivity: 0.7599193006052455
Specificity: 0.7672310994514667
Threshold: 0.15
Accuracy:  0.7653169014084507

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0010.model
AUROC/AUPRC on Validation Data
Loss: 0.5331755988299847
AUROC: 0.8412780421823813
AUPRC: 0.7203095942274845
Sensitivity: 0.7575757575757576
Specificity: 0.7785613540197461
Threshold: 0.19
Accuracy:  0.7726124305204649

AUROC/AUPRC on Test Data
Loss: 0.5007967213789623
AUROC: 0.8356911822326609
AUPRC: 0.6769115746352784
Sensitivity: 0.7760591795561533
Specificity: 0.7486286668256619
Threshold: 0.19
Accuracy:  0.7558098591549296

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0011.model
AUROC/AUPRC on Validation Data
Loss: 0.5639027561992407
AUROC: 0.8401781022705274
AUPRC: 0.7199371310188074
Sensitivity: 0.7450980392156863
Specificity: 0.7884344146685472
Threshold: 0.15
Accuracy:  0.7761495704901465

AUROC/AUPRC on Test Data
Loss: 0.5230910082658132
AUROC: 0.8343853262979851
AUPRC: 0.6766180286556339
Sensitivity: 0.7639542703429725
Specificity: 0.7581683758645361
Threshold: 0.15
Accuracy:  0.7596830985915493

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0012.model
AUROC/AUPRC on Validation Data
Loss: 0.5646946243941784
AUROC: 0.8392302683350554
AUPRC: 0.7193480276795289
Sensitivity: 0.7807486631016043
Specificity: 0.7397743300423131
Threshold: 0.14
Accuracy:  0.751389590702375

AUROC/AUPRC on Test Data
Loss: 0.525813176896837
AUROC: 0.8335749001081156
AUPRC: 0.6757789015572799
Sensitivity: 0.7572293207800942
Specificity: 0.7615072740281421
Threshold: 0.15
Accuracy:  0.7603873239436619

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0013.model
AUROC/AUPRC on Validation Data
Loss: 0.5549632534384727
AUROC: 0.8400850787808392
AUPRC: 0.7219098538429185
Sensitivity: 0.7468805704099821
Specificity: 0.7898448519040903
Threshold: 0.16
Accuracy:  0.7776654876200101

AUROC/AUPRC on Test Data
Loss: 0.5140393343236711
AUROC: 0.8342008833693585
AUPRC: 0.6760031984624799
Sensitivity: 0.7646267652992602
Specificity: 0.7543524922489864
Threshold: 0.16
Accuracy:  0.7570422535211268

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0014.model
AUROC/AUPRC on Validation Data
Loss: 0.5638203993439674
AUROC: 0.8400511377778449
AUPRC: 0.7237301411574177
Sensitivity: 0.7754010695187166
Specificity: 0.7383638928067701
Threshold: 0.13
Accuracy:  0.7488630621526023

AUROC/AUPRC on Test Data
Loss: 0.520638124148051
AUROC: 0.834181236187831
AUPRC: 0.6761364465845185
Sensitivity: 0.7673167451244116
Specificity: 0.7502981159074649
Threshold: 0.14
Accuracy:  0.7547535211267605

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0015.model
AUROC/AUPRC on Validation Data
Loss: 0.5301458425819874
AUROC: 0.8410743961644154
AUPRC: 0.7247621876445975
Sensitivity: 0.7664884135472371
Specificity: 0.7595204513399154
Threshold: 0.19
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
Loss: 0.492558573351966
AUROC: 0.8351234187828018
AUPRC: 0.6764123407616305
Sensitivity: 0.7552118359112306
Specificity: 0.7662771285475793
Threshold: 0.2
Accuracy:  0.7633802816901408

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0016.model
AUROC/AUPRC on Validation Data
Loss: 0.5647581201046705
AUROC: 0.8402321564604813
AUPRC: 0.7246863838680128
Sensitivity: 0.7718360071301248
Specificity: 0.7475317348377997
Threshold: 0.13
Accuracy:  0.7544214249621021

AUROC/AUPRC on Test Data
Loss: 0.5222489502694871
AUROC: 0.8344605469358336
AUPRC: 0.6759104939314866
Sensitivity: 0.7605917955615333
Specificity: 0.7557834486048175
Threshold: 0.14
Accuracy:  0.7570422535211268

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0017.model
AUROC/AUPRC on Validation Data
Loss: 0.5561709273606539
AUROC: 0.8392233544270381
AUPRC: 0.7226861424516845
Sensitivity: 0.7629233511586453
Specificity: 0.7665726375176305
Threshold: 0.15
Accuracy:  0.7655381505811015

AUROC/AUPRC on Test Data
Loss: 0.5157758080297046
AUROC: 0.8334401765776406
AUPRC: 0.6747257652434219
Sensitivity: 0.7478143913920645
Specificity: 0.7686620558072979
Threshold: 0.16
Accuracy:  0.7632042253521126

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0018.model
AUROC/AUPRC on Validation Data
Loss: 0.5484199021011591
AUROC: 0.8404408307751875
AUPRC: 0.7249135576950615
Sensitivity: 0.7736185383244206
Specificity: 0.7468265162200282
Threshold: 0.13
Accuracy:  0.7544214249621021

AUROC/AUPRC on Test Data
Loss: 0.511433842115932
AUROC: 0.8342929444485164
AUPRC: 0.6753682718716056
Sensitivity: 0.7431069266980498
Specificity: 0.771762461244932
Threshold: 0.15
Accuracy:  0.7642605633802817

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0019.model
AUROC/AUPRC on Validation Data
Loss: 0.5527403075248003
AUROC: 0.84082046717905
AUPRC: 0.7267870187574753
Sensitivity: 0.768270944741533
Specificity: 0.7496473906911142
Threshold: 0.13
Accuracy:  0.7549267306720566

AUROC/AUPRC on Test Data
Loss: 0.5137203156948089
AUROC: 0.8343411401876923
AUPRC: 0.6743165033499506
Sensitivity: 0.7700067249495629
Specificity: 0.7543524922489864
Threshold: 0.14
Accuracy:  0.7584507042253521

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0020.model
AUROC/AUPRC on Validation Data
Loss: 0.5667245667427778
AUROC: 0.841500544313122
AUPRC: 0.7263117049489669
Sensitivity: 0.7700534759358288
Specificity: 0.7588152327221439
Threshold: 0.12
Accuracy:  0.7620010106114199

AUROC/AUPRC on Test Data
Loss: 0.5251605649789174
AUROC: 0.8356285518295055
AUPRC: 0.675240708890052
Sensitivity: 0.7666442501681238
Specificity: 0.7593608394943954
Threshold: 0.13
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0021.model
AUROC/AUPRC on Validation Data
Loss: 0.5085540823638439
AUROC: 0.8413220397788556
AUPRC: 0.7283751534810328
Sensitivity: 0.7611408199643493
Specificity: 0.7496473906911142
Threshold: 0.17
Accuracy:  0.7529055078322385

AUROC/AUPRC on Test Data
Loss: 0.4792861819267273
AUROC: 0.834548438000953
AUPRC: 0.6743057817612453
Sensitivity: 0.7679892400806994
Specificity: 0.7488671595516336
Threshold: 0.18
Accuracy:  0.7538732394366198

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0022.model
AUROC/AUPRC on Validation Data
Loss: 0.5051713734865189
AUROC: 0.8439380111577905
AUPRC: 0.7297903966351216
Sensitivity: 0.7593582887700535
Specificity: 0.770098730606488
Threshold: 0.2
Accuracy:  0.7670540677109652

AUROC/AUPRC on Test Data
Loss: 0.4733465065558751
AUROC: 0.8379270314905025
AUPRC: 0.6754900616251963
Sensitivity: 0.7579018157363819
Specificity: 0.763415215835917
Threshold: 0.21
Accuracy:  0.7619718309859155

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0023.model
AUROC/AUPRC on Validation Data
Loss: 0.5799041520804167
AUROC: 0.8433245589555222
AUPRC: 0.7294245237512167
Sensitivity: 0.7647058823529411
Specificity: 0.7729196050775741
Threshold: 0.11
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
Loss: 0.5305726829502317
AUROC: 0.8377518107083074
AUPRC: 0.6756067845880512
Sensitivity: 0.753866845998655
Specificity: 0.7705699976150727
Threshold: 0.12
Accuracy:  0.7661971830985915

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0024.model
AUROC/AUPRC on Validation Data
Loss: 0.5032052006572485
AUROC: 0.8441894259947856
AUPRC: 0.731963703374374
Sensitivity: 0.7629233511586453
Specificity: 0.7736248236953456
Threshold: 0.19
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
Loss: 0.47218375934494866
AUROC: 0.8373482816575037
AUPRC: 0.6749640181430733
Sensitivity: 0.7565568258238063
Specificity: 0.7626997376580015
Threshold: 0.2
Accuracy:  0.7610915492957746

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0025.model
AUROC/AUPRC on Validation Data
Loss: 0.4950281009078026
AUROC: 0.8441102303211322
AUPRC: 0.731444071827037
Sensitivity: 0.7664884135472371
Specificity: 0.7609308885754584
Threshold: 0.18
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
Loss: 0.4621902300251855
AUROC: 0.8377944731596244
AUPRC: 0.6757962442433705
Sensitivity: 0.7673167451244116
Specificity: 0.7498211304555211
Threshold: 0.19
Accuracy:  0.7544014084507042

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0026.model
AUROC/AUPRC on Validation Data
Loss: 0.5062575042247772
AUROC: 0.8453874176930678
AUPRC: 0.7323299751378214
Sensitivity: 0.7700534759358288
Specificity: 0.7538787023977433
Threshold: 0.17
Accuracy:  0.7584638706417383

AUROC/AUPRC on Test Data
Loss: 0.4719485292832057
AUROC: 0.8397267935110089
AUPRC: 0.6766070127094252
Sensitivity: 0.7525218560860794
Specificity: 0.7677080849034105
Threshold: 0.19
Accuracy:  0.7637323943661972

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0027.model
AUROC/AUPRC on Validation Data
Loss: 0.5869141556322575
AUROC: 0.842807901465497
AUPRC: 0.7299497810823878
Sensitivity: 0.7771836007130125
Specificity: 0.7581100141043724
Threshold: 0.09
Accuracy:  0.7635169277412834

AUROC/AUPRC on Test Data
Loss: 0.5418211708466212
AUROC: 0.8363343267055238
AUPRC: 0.6730733763688754
Sensitivity: 0.7666442501681238
Specificity: 0.7562604340567612
Threshold: 0.1
Accuracy:  0.7589788732394366

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0028.model
AUROC/AUPRC on Validation Data
Loss: 0.4751395992934704
AUROC: 0.8444559257220006
AUPRC: 0.7311369042794256
Sensitivity: 0.7700534759358288
Specificity: 0.7602256699576869
Threshold: 0.18
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
Loss: 0.4474008798599243
AUROC: 0.8375314415048875
AUPRC: 0.6759980747531694
Sensitivity: 0.7518493611297915
Specificity: 0.7626997376580015
Threshold: 0.2
Accuracy:  0.7598591549295775

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0029.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.5680038742721081
AUROC: 0.8456690023105023
AUPRC: 0.7332500456292966
Sensitivity: 0.7771836007130125
Specificity: 0.7595204513399154
Threshold: 0.1
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.5151646998193529
AUROC: 0.8407806041740878
AUPRC: 0.6775228585657982
Sensitivity: 0.7531943510423672
Specificity: 0.7722394466968757
Threshold: 0.12
Accuracy:  0.7672535211267606

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0030.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.49it/s]
Loss: 0.5911647342145443
AUROC: 0.8466294069878241
AUPRC: 0.7352037974509527
Sensitivity: 0.7593582887700535
Specificity: 0.7778561354019746
Threshold: 0.09
Accuracy:  0.7726124305204649

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.540356284711096
AUROC: 0.8424757950733208
AUPRC: 0.6789309696087095
Sensitivity: 0.7592468056489576
Specificity: 0.7731934176007632
Threshold: 0.1
Accuracy:  0.7695422535211267

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0031.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.5051045175641775
AUROC: 0.8441014308018373
AUPRC: 0.7295867062606514
Sensitivity: 0.7629233511586453
Specificity: 0.7757404795486601
Threshold: 0.15
Accuracy:  0.7721071248105104

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.50it/s]
Loss: 0.47170162002245586
AUROC: 0.8375131575971801
AUPRC: 0.6761909921417513
Sensitivity: 0.7619367854741089
Specificity: 0.7560219413307894
Threshold: 0.16
Accuracy:  0.7575704225352112

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0032.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.505717720836401
AUROC: 0.8429336088839947
AUPRC: 0.7296225168188023
Sensitivity: 0.768270944741533
Specificity: 0.7574047954866009
Threshold: 0.19
Accuracy:  0.7604850934815564

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.46928496261437735
AUROC: 0.8391980036538946
AUPRC: 0.6770596001235714
Sensitivity: 0.7619367854741089
Specificity: 0.7586453613164799
Threshold: 0.2
Accuracy:  0.7595070422535212

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0033.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
Loss: 0.502472223713994
AUROC: 0.8440310346474786
AUPRC: 0.7317653408092115
Sensitivity: 0.7664884135472371
Specificity: 0.7616361071932299
Threshold: 0.2
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.470398810505867
AUROC: 0.8393545395654942
AUPRC: 0.6758113566694667
Sensitivity: 0.7558843308675185
Specificity: 0.7638922012878607
Threshold: 0.21
Accuracy:  0.7617957746478873

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0034.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.4772149585187435
AUROC: 0.8464031336345283
AUPRC: 0.7310900615702502
Sensitivity: 0.7664884135472371
Specificity: 0.7771509167842031
Threshold: 0.19
Accuracy:  0.7741283476503285

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.44746097955438824
AUROC: 0.8418632841651255
AUPRC: 0.678820734538399
Sensitivity: 0.7605917955615333
Specificity: 0.7619842594800859
Threshold: 0.2
Accuracy:  0.7616197183098592

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0035.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.49it/s]
Loss: 0.46521933749318123
AUROC: 0.8442661075200693
AUPRC: 0.7282797013531678
Sensitivity: 0.7486631016042781
Specificity: 0.7799717912552891
Threshold: 0.22
Accuracy:  0.7710965133906014

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.50it/s]
Loss: 0.43948978583017984
AUROC: 0.8421557464958649
AUPRC: 0.6788311839934166
Sensitivity: 0.7706792199058508
Specificity: 0.7588838540424517
Threshold: 0.22
Accuracy:  0.7619718309859155

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0036.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.49it/s]
Loss: 0.49091689847409725
AUROC: 0.8445602628793535
AUPRC: 0.7237045854437764
Sensitivity: 0.7664884135472371
Specificity: 0.7609308885754584
Threshold: 0.19
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.4548961020178265
AUROC: 0.8415518161934797
AUPRC: 0.6777799458454066
Sensitivity: 0.7679892400806994
Specificity: 0.7610302885761985
Threshold: 0.2
Accuracy:  0.7628521126760563

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0037.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.5434265956282616
AUROC: 0.8469851589821722
AUPRC: 0.7336399872977093
Sensitivity: 0.7557932263814616
Specificity: 0.7792665726375176
Threshold: 0.11
Accuracy:  0.7726124305204649

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.49795025057262843
AUROC: 0.8425689788485662
AUPRC: 0.6794098751291822
Sensitivity: 0.7605917955615333
Specificity: 0.7696160267111853
Threshold: 0.12
Accuracy:  0.7672535211267606

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0038.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.49342846125364304
AUROC: 0.8459380161860872
AUPRC: 0.7302145714054916
Sensitivity: 0.7575757575757576
Specificity: 0.7799717912552891
Threshold: 0.19
Accuracy:  0.7736230419403739

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.45606137481000686
AUROC: 0.8429888671852133
AUPRC: 0.6801835028686807
Sensitivity: 0.773369199731002
Specificity: 0.7548294777009301
Threshold: 0.19
Accuracy:  0.7596830985915493

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0039.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.4634505230933428
AUROC: 0.8439329828610505
AUPRC: 0.7287757856808748
Sensitivity: 0.7486631016042781
Specificity: 0.7842031029619182
Threshold: 0.25
Accuracy:  0.7741283476503285

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.43938931624094646
AUROC: 0.8408713821720031
AUPRC: 0.678120596786827
Sensitivity: 0.769334229993275
Specificity: 0.7548294777009301
Threshold: 0.25
Accuracy:  0.7586267605633803

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0040.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
Loss: 0.5257743988186121
AUROC: 0.8447965928261291
AUPRC: 0.7218195022140583
Sensitivity: 0.7700534759358288
Specificity: 0.765867418899859
Threshold: 0.14
Accuracy:  0.7670540677109652

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.47it/s]
Loss: 0.4808568060398102
AUROC: 0.8425065890231436
AUPRC: 0.6768179500949117
Sensitivity: 0.773369199731002
Specificity: 0.7588838540424517
Threshold: 0.15
Accuracy:  0.7626760563380282

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0041.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.5226410962641239
AUROC: 0.8450681208500838
AUPRC: 0.7301745502850309
Sensitivity: 0.768270944741533
Specificity: 0.7609308885754584
Threshold: 0.15
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.4827634490198559
AUROC: 0.8415530992747222
AUPRC: 0.6775153231248328
Sensitivity: 0.773369199731002
Specificity: 0.7562604340567612
Threshold: 0.16
Accuracy:  0.7607394366197183

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0042.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.46953093260526657
AUROC: 0.8432051369079494
AUPRC: 0.7275289488275865
Sensitivity: 0.7522281639928698
Specificity: 0.7764456981664316
Threshold: 0.22
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.4439319137069914
AUROC: 0.8410752317044242
AUPRC: 0.6772220003001221
Sensitivity: 0.773369199731002
Specificity: 0.753398521345099
Threshold: 0.22
Accuracy:  0.7586267605633803

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0043.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.5182556118816137
AUROC: 0.8477016912676083
AUPRC: 0.7343785374060654
Sensitivity: 0.7647058823529411
Specificity: 0.7792665726375176
Threshold: 0.12
Accuracy:  0.7751389590702374

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.47507025566365985
AUROC: 0.843736502586772
AUPRC: 0.6803797425621928
Sensitivity: 0.7673167451244116
Specificity: 0.7660386358216075
Threshold: 0.13
Accuracy:  0.7663732394366197

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0044.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.5012157391756773
AUROC: 0.8469575033501027
AUPRC: 0.7318941510339284
Sensitivity: 0.7754010695187166
Specificity: 0.7595204513399154
Threshold: 0.11
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.46614760690265233
AUROC: 0.8438028218485
AUPRC: 0.6825915566603222
Sensitivity: 0.7626092804303968
Specificity: 0.7629382303839732
Threshold: 0.13
Accuracy:  0.7628521126760563

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0045.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.4493454024195671
AUROC: 0.846609922337957
AUPRC: 0.7309247833426618
Sensitivity: 0.7664884135472371
Specificity: 0.7771509167842031
Threshold: 0.22
Accuracy:  0.7741283476503285

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.4246538817882538
AUROC: 0.8437478097402225
AUPRC: 0.6799881569204674
Sensitivity: 0.7552118359112306
Specificity: 0.7715239685189602
Threshold: 0.23
Accuracy:  0.7672535211267606

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0046.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.44881695322692394
AUROC: 0.8457821389871503
AUPRC: 0.7294035412494164
Sensitivity: 0.7593582887700535
Specificity: 0.7785613540197461
Threshold: 0.22
Accuracy:  0.7731177362304194

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.42165348629156746
AUROC: 0.8434780419089618
AUPRC: 0.6779003669810428
Sensitivity: 0.753866845998655
Specificity: 0.7703315048891008
Threshold: 0.23
Accuracy:  0.7660211267605633

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0047.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.48098517395555973
AUROC: 0.8458148229159594
AUPRC: 0.7296931827037744
Sensitivity: 0.7647058823529411
Specificity: 0.7743300423131171
Threshold: 0.16
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.4450910558303197
AUROC: 0.842675153821393
AUPRC: 0.6802044288941403
Sensitivity: 0.7659717552118359
Specificity: 0.7595993322203672
Threshold: 0.17
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0048.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.5043739173561335
AUROC: 0.845372332802848
AUPRC: 0.7292207677603586
Sensitivity: 0.768270944741533
Specificity: 0.7651622002820875
Threshold: 0.12
Accuracy:  0.7660434562910561

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.4644781708717346
AUROC: 0.8411158893412998
AUPRC: 0.6794237419984895
Sensitivity: 0.7585743106926698
Specificity: 0.7624612449320296
Threshold: 0.14
Accuracy:  0.761443661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0049.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.42674148827791214
AUROC: 0.8463930770410485
AUPRC: 0.7284060976369059
Sensitivity: 0.7647058823529411
Specificity: 0.7743300423131171
Threshold: 0.25
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.50it/s]
Loss: 0.41356296804216175
AUROC: 0.8434853394335293
AUPRC: 0.678605731944286
Sensitivity: 0.7659717552118359
Specificity: 0.7660386358216075
Threshold: 0.26
Accuracy:  0.7660211267605633

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0050.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.6090516299009323
AUROC: 0.8462170866551518
AUPRC: 0.7317505532182749
Sensitivity: 0.7700534759358288
Specificity: 0.7708039492242595
Threshold: 0.06
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.542758755882581
AUROC: 0.843559036412402
AUPRC: 0.6812475187984902
Sensitivity: 0.7592468056489576
Specificity: 0.7708084903410446
Threshold: 0.07
Accuracy:  0.7677816901408451

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0051.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
Loss: 0.5332378298044205
AUROC: 0.8470517839139758
AUPRC: 0.7332419248571193
Sensitivity: 0.7789661319073083
Specificity: 0.7616361071932299
Threshold: 0.09
Accuracy:  0.7665487620010106

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.48932356900639007
AUROC: 0.8434522198989541
AUPRC: 0.6792637499369341
Sensitivity: 0.7740416946872899
Specificity: 0.7576913904125924
Threshold: 0.1
Accuracy:  0.7619718309859155

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0052.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.4921722449362278
AUROC: 0.8476501512260244
AUPRC: 0.7329907178596754
Sensitivity: 0.7664884135472371
Specificity: 0.763046544428773
Threshold: 0.14
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.47it/s]
Loss: 0.45695666538344487
AUROC: 0.84388622212927
AUPRC: 0.6806309601087861
Sensitivity: 0.7659717552118359
Specificity: 0.7595993322203672
Threshold: 0.16
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0053.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.48it/s]
Loss: 0.45928443782031536
AUROC: 0.8476790639322789
AUPRC: 0.7313358213505751
Sensitivity: 0.7664884135472371
Specificity: 0.765867418899859
Threshold: 0.18
Accuracy:  0.7660434562910561

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.50it/s]
Loss: 0.42852962348196244
AUROC: 0.8433096374958681
AUPRC: 0.6796453738269175
Sensitivity: 0.7646267652992602
Specificity: 0.7655616503696637
Threshold: 0.2
Accuracy:  0.7653169014084507

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0054.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.49it/s]
Loss: 0.4578698016703129
AUROC: 0.8467576285546916
AUPRC: 0.7298741142149887
Sensitivity: 0.7771836007130125
Specificity: 0.7588152327221439
Threshold: 0.17
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.4327228933572769
AUROC: 0.843121746286402
AUPRC: 0.6782068366270517
Sensitivity: 0.7632817753866846
Specificity: 0.7612687813021702
Threshold: 0.19
Accuracy:  0.7617957746478873

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0055.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.44661819748580456
AUROC: 0.8458651058833586
AUPRC: 0.7275879508875388
Sensitivity: 0.7754010695187166
Specificity: 0.7574047954866009
Threshold: 0.26
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.4275800863901774
AUROC: 0.8432029813675754
AUPRC: 0.6792750661674245
Sensitivity: 0.7740416946872899
Specificity: 0.7541139995230145
Threshold: 0.27
Accuracy:  0.759330985915493

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0056.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.4550197683274746
AUROC: 0.8481014408584308
AUPRC: 0.7305359436232469
Sensitivity: 0.7754010695187166
Specificity: 0.7559943582510579
Threshold: 0.2
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.4261281172434489
AUROC: 0.8446996956370909
AUPRC: 0.6805039796234126
Sensitivity: 0.7599193006052455
Specificity: 0.7727164321488195
Threshold: 0.22
Accuracy:  0.7693661971830986

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0057.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.5131444986909628
AUROC: 0.8470065292433169
AUPRC: 0.7339173360496939
Sensitivity: 0.7754010695187166
Specificity: 0.7602256699576869
Threshold: 0.13
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.47it/s]
Loss: 0.468035982714759
AUROC: 0.8433413937566229
AUPRC: 0.679953166274347
Sensitivity: 0.769334229993275
Specificity: 0.761745766754114
Threshold: 0.15
Accuracy:  0.7637323943661972

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0058.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.5290234535932541
AUROC: 0.8471561210713289
AUPRC: 0.7345456992170544
Sensitivity: 0.768270944741533
Specificity: 0.770098730606488
Threshold: 0.09
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.4917702105310228
AUROC: 0.8428269583709103
AUPRC: 0.6802457985301114
Sensitivity: 0.7652992602555481
Specificity: 0.7610302885761985
Threshold: 0.1
Accuracy:  0.7621478873239437

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0059.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.52it/s]
Loss: 0.4571169950067997
AUROC: 0.846263598399996
AUPRC: 0.7293878922771273
Sensitivity: 0.7629233511586453
Specificity: 0.764456981664316
Threshold: 0.23
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.50it/s]
Loss: 0.43508988188372716
AUROC: 0.8414578304924578
AUPRC: 0.6776268760963118
Sensitivity: 0.7652992602555481
Specificity: 0.761745766754114
Threshold: 0.24
Accuracy:  0.7626760563380282

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0060.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.4327222555875778
AUROC: 0.8466495201747836
AUPRC: 0.7303007107820702
Sensitivity: 0.7754010695187166
Specificity: 0.7574047954866009
Threshold: 0.2
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.41457836627960204
AUROC: 0.8443361024899635
AUPRC: 0.6800185055106931
Sensitivity: 0.7592468056489576
Specificity: 0.766754113999523
Threshold: 0.22
Accuracy:  0.7647887323943662

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0061.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.6126507334411144
AUROC: 0.8451988565653213
AUPRC: 0.7281115789883683
Sensitivity: 0.768270944741533
Specificity: 0.770098730606488
Threshold: 0.06
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.50it/s]
Loss: 0.5602702293131087
AUROC: 0.8418777188291049
AUPRC: 0.6776313735439492
Sensitivity: 0.7511768661735037
Specificity: 0.7751013594085381
Threshold: 0.07
Accuracy:  0.768838028169014

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0062.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.4259168580174446
AUROC: 0.8458651058833586
AUPRC: 0.7299495586474407
Sensitivity: 0.7736185383244206
Specificity: 0.7566995768688294
Threshold: 0.28
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.42046640382872685
AUROC: 0.8427550256287459
AUPRC: 0.6792671337638441
Sensitivity: 0.7639542703429725
Specificity: 0.7660386358216075
Threshold: 0.3
Accuracy:  0.7654929577464789

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0063.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.5409273337572813
AUROC: 0.8451309745593327
AUPRC: 0.7313397889495351
Sensitivity: 0.7575757575757576
Specificity: 0.7771509167842031
Threshold: 0.09
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.50it/s]
Loss: 0.5010275317562951
AUROC: 0.8405495693578386
AUPRC: 0.6778769567329911
Sensitivity: 0.7592468056489576
Specificity: 0.7624612449320296
Threshold: 0.1
Accuracy:  0.7616197183098592

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0064.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.46851047687232494
AUROC: 0.8484408508883744
AUPRC: 0.7321288632708871
Sensitivity: 0.7754010695187166
Specificity: 0.7574047954866009
Threshold: 0.16
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.43987531661987306
AUROC: 0.845767700386416
AUPRC: 0.6822823199263945
Sensitivity: 0.7700067249495629
Specificity: 0.7658001430956356
Threshold: 0.18
Accuracy:  0.7669014084507042

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0065.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.49199312925338745
AUROC: 0.8488016311794623
AUPRC: 0.7329473330825662
Sensitivity: 0.7629233511586453
Specificity: 0.771509167842031
Threshold: 0.13
Accuracy:  0.7690752905507833

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.4563362840149138
AUROC: 0.8461053111383803
AUPRC: 0.6809830131576231
Sensitivity: 0.7713517148621385
Specificity: 0.760076317672311
Threshold: 0.14
Accuracy:  0.7630281690140845

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0066.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.5556049328297377
AUROC: 0.8469260764954782
AUPRC: 0.7273562512837821
Sensitivity: 0.7664884135472371
Specificity: 0.7743300423131171
Threshold: 0.09
Accuracy:  0.7721071248105104

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.45it/s]
Loss: 0.5069198489189148
AUROC: 0.8436634471485204
AUPRC: 0.6790900815267988
Sensitivity: 0.7646267652992602
Specificity: 0.7677080849034105
Threshold: 0.1
Accuracy:  0.7669014084507042

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0067.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
Loss: 0.5434247124940157
AUROC: 0.8466457489522287
AUPRC: 0.733012192217917
Sensitivity: 0.7754010695187166
Specificity: 0.7581100141043724
Threshold: 0.11
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.4952555408080419
AUROC: 0.8431215057086691
AUPRC: 0.6780862521672918
Sensitivity: 0.773369199731002
Specificity: 0.7562604340567612
Threshold: 0.12
Accuracy:  0.7607394366197183

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0068.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.51it/s]
Loss: 0.5488560330122709
AUROC: 0.8451246891884078
AUPRC: 0.7263926777072109
Sensitivity: 0.7754010695187166
Specificity: 0.7595204513399154
Threshold: 0.11
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.46it/s]
Loss: 0.5045998374621073
AUROC: 0.8412982472629071
AUPRC: 0.6745665457065393
Sensitivity: 0.7726967047747142
Specificity: 0.7555449558788457
Threshold: 0.12
Accuracy:  0.7600352112676056

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0069.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.50it/s]
Loss: 0.5266331508755684
AUROC: 0.8466080367266794
AUPRC: 0.7307424344171782
Sensitivity: 0.768270944741533
Specificity: 0.763046544428773
Threshold: 0.12
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.49it/s]
Loss: 0.4872579528225793
AUROC: 0.8406709809204215
AUPRC: 0.6758827249459759
Sensitivity: 0.7612642905178211
Specificity: 0.7681850703553542
Threshold: 0.14
Accuracy:  0.7663732394366197


Plot AUROC/AUPRC for Each Intermediate Model
  Epoch with best Validation Loss:      49, 0.4265
  Epoch with best model Test AUROC:     65, 0.8461
  Epoch with best model Test Accuracy:   6, 0.772

AUROC/AUPRC Plots - Best Model Based on Validation Loss
  Epoch with best Validation Loss:   49, 0.4265
  Best Model Based on Validation Loss:
    ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0049.model

Generate Stats Based on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.47it/s]
Loss: 0.41356296804216175
AUROC: 0.8434853394335293
AUPRC: 0.678605731944286
Sensitivity: 0.7659717552118359
Specificity: 0.7660386358216075
Threshold: 0.26
Accuracy:  0.7660211267605633
best_model_val_test_auroc: 0.8434853394335293
best_model_val_test_auprc: 0.678605731944286

AUROC/AUPRC Plots - Best Model Based on Model AUROC
  Epoch with best model Test AUROC:  65, 0.8461
  Best Model Based on Model AUROC:
    ./vitaldb_cache/models/ABP_EEG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_3ec9ab20_0065.model

Generate Stats Based on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.48it/s]
Loss: 0.4563362840149138
AUROC: 0.8461053111383803
AUPRC: 0.6809830131576231
Sensitivity: 0.7713517148621385
Specificity: 0.760076317672311
Threshold: 0.14
Accuracy:  0.7630281690140845
best_model_auroc_test_auroc: 0.8461053111383803
best_model_auroc_test_auprc: 0.6809830131576231

Total Processing Time: 5649.1750 sec
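The stats blocks above report sensitivity, specificity, and accuracy at a per-evaluation threshold. This is consistent with sweeping a decision threshold over the predicted probabilities and keeping the one that best balances sensitivity and specificity. A minimal pure-Python sketch of that idea (the helper name and the use of Youden's J as the selection criterion are assumptions; the notebook's actual metric code is not shown in this output):

```python
def threshold_stats(y_true, y_prob, step=0.01):
    """Sweep decision thresholds over predicted probabilities and report
    sensitivity/specificity/accuracy at the threshold that maximizes
    Youden's J (sensitivity + specificity - 1). Hypothetical helper --
    a sketch of the idea, not the notebook's actual implementation."""
    best = None
    t = step
    while t < 1.0:
        tp = fp = tn = fn = 0
        for truth, prob in zip(y_true, y_prob):
            pred = 1 if prob >= t else 0
            if pred and truth:
                tp += 1
            elif pred and not truth:
                fp += 1
            elif not pred and not truth:
                tn += 1
            else:
                fn += 1
        sens = tp / (tp + fn) if (tp + fn) else 0.0
        spec = tn / (tn + fp) if (tn + fp) else 0.0
        j = sens + spec - 1.0
        if best is None or j > best["j"]:
            best = {"j": j, "threshold": round(t, 2), "sensitivity": sens,
                    "specificity": spec, "accuracy": (tp + tn) / len(y_true)}
        t += step
    return best
```

Selecting the threshold on validation data and then applying it to test data would explain why the reported thresholds differ between the validation and test blocks above.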
In [110]:
# Experiment: EEG + ECG (no ABP)
RUN_ME = False
DISPLAY_MODEL_PREDICTION = True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY = True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=False, 
        useEeg=True, 
        useEcg=True,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=1e-1,
        balance_labels=False,
        pos_weight=None,  # set to e.g. 2.0 to upweight the positive class
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
In [111]:
# Experiment: ABP + EEG + ECG (all three signals)
RUN_ME = True
DISPLAY_MODEL_PREDICTION = True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY = True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=True, 
        useEcg=True,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=1e-1,
        balance_labels=False,
        pos_weight=None,  # set to e.g. 2.0 to upweight the positive class
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
Experiment Setup
  name:              ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e
  prediction_window: 003
  max_cases:         _ALL
  use_abp:           True
  use_eeg:           True
  use_ecg:           True
  n_residuals:       12
  skip_connection:   False
  batch_size:        128
  learning_rate:     0.0001
  weight_decay:      0.1
  balance_labels:    False
  max_epochs:        200
  patience:          20
  device:            mps

Model Architecture
HypotensionCNN(
  (abpResiduals): Sequential(
    (0): ResidualBlock(
      (bn1): BatchNorm1d(1, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (1): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (2): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (3): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (4): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (5): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (6): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (7): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (8): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=1, dilation=1, ceil_mode=False)
    )
    (9): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (10): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (11): ResidualBlock(
      (bn1): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
  )
  (abpFc): Linear(in_features=2814, out_features=32, bias=True)
  (ecgResiduals): Sequential(
    (0): ResidualBlock(
      (bn1): BatchNorm1d(1, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(1, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (1): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (2): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (3): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (4): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (5): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
      (residualConv): Conv1d(2, 4, kernel_size=(15,), stride=(1,), padding=(7,), bias=False)
    )
    (6): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (7): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (8): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=1, dilation=1, ceil_mode=False)
    )
    (9): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (10): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(4, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (11): ResidualBlock(
      (bn1): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(6, 6, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
  )
  (ecgFc): Linear(in_features=2814, out_features=32, bias=True)
  (eegResiduals): Sequential(
    (0): ResidualBlock(
      (bn1): BatchNorm1d(1, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(1, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(1, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (1): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (2): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (3): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (4): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 2, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (5): ResidualBlock(
      (bn1): BatchNorm1d(2, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(2, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
      (residualConv): Conv1d(2, 4, kernel_size=(7,), stride=(1,), padding=(3,), bias=False)
    )
    (6): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (7): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
    )
    (8): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (9): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 4, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
    )
    (10): ResidualBlock(
      (bn1): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(4, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(4, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (downsample): MaxPool1d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False)
    )
    (11): ResidualBlock(
      (bn1): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (dropout): Dropout(p=0.5, inplace=False)
      (conv1): Conv1d(6, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (bn2): BatchNorm1d(6, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv1d(6, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
      (residualConv): Conv1d(6, 6, kernel_size=(3,), stride=(1,), padding=(1,), bias=False)
    )
  )
  (eegFc): Linear(in_features=720, out_features=32, bias=True)
  (fullLinear1): Linear(in_features=96, out_features=16, bias=True)
  (fullLinear2): Linear(in_features=16, out_features=1, bias=True)
  (sigmoid): Sigmoid()
)
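The architecture printout above lists each `ResidualBlock`'s modules (bn1, relu, dropout, conv1, bn2, conv2, a projection `residualConv`, and an optional `MaxPool1d` downsample), but not the forward-pass wiring. A sketch of one plausible reconstruction in PyTorch, assuming a pre-activation ordering (the ordering is an assumption; only the module list is visible in the printout):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """1-D residual block mirroring the printed module list above.
    Pre-activation ordering (BN -> ReLU -> Dropout -> Conv) is assumed;
    the printout shows only which modules exist, not how they connect."""
    def __init__(self, in_ch, out_ch, kernel_size=15, downsample=False):
        super().__init__()
        pad = kernel_size // 2  # "same" padding, as in the printout
        self.bn1 = nn.BatchNorm1d(in_ch)
        self.relu = nn.ReLU()
        self.dropout = nn.Dropout(p=0.5)
        self.conv1 = nn.Conv1d(in_ch, out_ch, kernel_size, padding=pad, bias=False)
        self.bn2 = nn.BatchNorm1d(out_ch)
        self.conv2 = nn.Conv1d(out_ch, out_ch, kernel_size, padding=pad, bias=False)
        # Projection shortcut so the skip path matches out_ch channels
        self.residualConv = nn.Conv1d(in_ch, out_ch, kernel_size, padding=pad, bias=False)
        self.downsample = nn.MaxPool1d(kernel_size=2, stride=2) if downsample else None

    def forward(self, x):
        out = self.conv1(self.dropout(self.relu(self.bn1(x))))
        out = self.conv2(self.dropout(self.relu(self.bn2(out))))
        out = out + self.residualConv(x)  # residual connection
        if self.downsample is not None:
            out = self.downsample(out)    # halve the temporal length
        return out
```

With `kernel_size // 2` padding the convolutions preserve sequence length, so only the `MaxPool1d` blocks shrink the signal, which matches the alternating presence of `downsample` in the printed stacks.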

Training Loop
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:58<00:00,  1.58it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.31it/s]
[2024-05-05 08:54:27.918560] Completed epoch 0 with training loss 0.52639765, validation loss 0.58074337
Validation loss improved to 0.58074337. Model saved.
[2024-05-05 08:55:32.854922] Completed epoch 1 with training loss 0.44737455, validation loss 0.58147371
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 08:56:37.572009] Completed epoch 2 with training loss 0.44076547, validation loss 0.73376203
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 08:57:42.303967] Completed epoch 3 with training loss 0.43887880, validation loss 0.67034256
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 08:58:46.321993] Completed epoch 4 with training loss 0.43838891, validation loss 0.55006850
Validation loss improved to 0.55006850. Model saved.
[2024-05-05 08:59:50.570358] Completed epoch 5 with training loss 0.43939614, validation loss 0.56090671
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:00:54.781023] Completed epoch 6 with training loss 0.43851224, validation loss 0.55608582
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 09:01:59.037168] Completed epoch 7 with training loss 0.43940637, validation loss 0.53840899
Validation loss improved to 0.53840899. Model saved.
[2024-05-05 09:03:03.396066] Completed epoch 8 with training loss 0.43743899, validation loss 0.57760209
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:04:07.612964] Completed epoch 9 with training loss 0.43525392, validation loss 0.54842627
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 09:05:11.769959] Completed epoch 10 with training loss 0.43928918, validation loss 0.57664877
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 09:06:16.110635] Completed epoch 11 with training loss 0.43535537, validation loss 0.55780345
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 09:07:20.409932] Completed epoch 12 with training loss 0.43659085, validation loss 0.61490941
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 09:08:24.784590] Completed epoch 13 with training loss 0.44054911, validation loss 0.57612604
No improvement in validation loss. 6 epochs without improvement.
[2024-05-05 09:09:28.991792] Completed epoch 14 with training loss 0.43614969, validation loss 0.56053460
No improvement in validation loss. 7 epochs without improvement.
[2024-05-05 09:10:33.899594] Completed epoch 15 with training loss 0.43653107, validation loss 0.55104840
No improvement in validation loss. 8 epochs without improvement.
[2024-05-05 09:11:38.041616] Completed epoch 16 with training loss 0.43524212, validation loss 0.55447280
No improvement in validation loss. 9 epochs without improvement.
[2024-05-05 09:12:42.273004] Completed epoch 17 with training loss 0.43662414, validation loss 0.54886353
No improvement in validation loss. 10 epochs without improvement.
[2024-05-05 09:13:46.540041] Completed epoch 18 with training loss 0.43446910, validation loss 0.54104722
No improvement in validation loss. 11 epochs without improvement.
[2024-05-05 09:14:50.904961] Completed epoch 19 with training loss 0.43210497, validation loss 0.54118586
No improvement in validation loss. 12 epochs without improvement.
[2024-05-05 09:15:55.398865] Completed epoch 20 with training loss 0.43563357, validation loss 0.57028401
No improvement in validation loss. 13 epochs without improvement.
[2024-05-05 09:16:59.728632] Completed epoch 21 with training loss 0.43004209, validation loss 0.51671064
Validation loss improved to 0.51671064. Model saved.
[2024-05-05 09:18:04.164520] Completed epoch 22 with training loss 0.43434691, validation loss 0.54195237
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:19:08.469212] Completed epoch 23 with training loss 0.43328330, validation loss 0.49948326
Validation loss improved to 0.49948326. Model saved.
[2024-05-05 09:20:12.874554] Completed epoch 24 with training loss 0.42860928, validation loss 0.56357300
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:21:17.081758] Completed epoch 25 with training loss 0.43032983, validation loss 0.48440278
Validation loss improved to 0.48440278. Model saved.
[2024-05-05 09:22:21.452873] Completed epoch 26 with training loss 0.43113944, validation loss 0.55373621
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:23:25.642437] Completed epoch 27 with training loss 0.43000260, validation loss 0.50376236
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 09:24:29.968436] Completed epoch 28 with training loss 0.42806745, validation loss 0.53264713
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 09:25:34.487631] Completed epoch 29 with training loss 0.43031469, validation loss 0.49229011
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 09:26:38.932488] Completed epoch 30 with training loss 0.42944285, validation loss 0.48117244
Validation loss improved to 0.48117244. Model saved.
[2024-05-05 09:27:42.857992] Completed epoch 31 with training loss 0.43034256, validation loss 0.50321847
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:28:46.637962] Completed epoch 32 with training loss 0.42835775, validation loss 0.49361616
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 09:29:50.440438] Completed epoch 33 with training loss 0.42723531, validation loss 0.45142704
Validation loss improved to 0.45142704. Model saved.
[2024-05-05 09:30:54.272482] Completed epoch 34 with training loss 0.42929479, validation loss 0.47940233
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:31:58.088229] Completed epoch 35 with training loss 0.42885858, validation loss 0.52129912
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 09:33:01.804689] Completed epoch 36 with training loss 0.42931083, validation loss 0.46083814
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 09:34:05.646534] Completed epoch 37 with training loss 0.42975935, validation loss 0.53557467
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 09:35:09.438138] Completed epoch 38 with training loss 0.42629188, validation loss 0.47801813
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 09:36:13.321031] Completed epoch 39 with training loss 0.42761028, validation loss 0.54363155
No improvement in validation loss. 6 epochs without improvement.
[2024-05-05 09:37:17.070160] Completed epoch 40 with training loss 0.42911997, validation loss 0.48531759
No improvement in validation loss. 7 epochs without improvement.
[2024-05-05 09:38:20.855190] Completed epoch 41 with training loss 0.42998537, validation loss 0.47967345
No improvement in validation loss. 8 epochs without improvement.
[2024-05-05 09:39:24.636106] Completed epoch 42 with training loss 0.42636839, validation loss 0.48681450
No improvement in validation loss. 9 epochs without improvement.
[2024-05-05 09:40:28.444615] Completed epoch 43 with training loss 0.42858717, validation loss 0.47652349
No improvement in validation loss. 10 epochs without improvement.
[2024-05-05 09:41:34.307254] Completed epoch 44 with training loss 0.42514598, validation loss 0.43046546
Validation loss improved to 0.43046546. Model saved.
[2024-05-05 09:42:38.300785] Completed epoch 45 with training loss 0.42698190, validation loss 0.44236922
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:43:42.042260] Completed epoch 46 with training loss 0.42572764, validation loss 0.47541523
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 09:44:45.765619] Completed epoch 47 with training loss 0.42325920, validation loss 0.51030809
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 09:45:49.725396] Completed epoch 48 with training loss 0.42716146, validation loss 0.56866592
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 09:46:53.400656] Completed epoch 49 with training loss 0.42623141, validation loss 0.43467841
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 09:47:58.232551] Completed epoch 50 with training loss 0.42436939, validation loss 0.45508179
No improvement in validation loss. 6 epochs without improvement.
[2024-05-05 09:49:01.798796] Completed epoch 51 with training loss 0.42736930, validation loss 0.48410159
No improvement in validation loss. 7 epochs without improvement.
[2024-05-05 09:50:05.472364] Completed epoch 52 with training loss 0.42675933, validation loss 0.44829038
No improvement in validation loss. 8 epochs without improvement.
[2024-05-05 09:51:09.006272] Completed epoch 53 with training loss 0.42609468, validation loss 0.49377853
No improvement in validation loss. 9 epochs without improvement.
[2024-05-05 09:52:12.631644] Completed epoch 54 with training loss 0.42855832, validation loss 0.44673514
No improvement in validation loss. 10 epochs without improvement.
[2024-05-05 09:53:16.164116] Completed epoch 55 with training loss 0.42478073, validation loss 0.47344184
No improvement in validation loss. 11 epochs without improvement.
[2024-05-05 09:54:19.762134] Completed epoch 56 with training loss 0.42429280, validation loss 0.49600908
No improvement in validation loss. 12 epochs without improvement.
[2024-05-05 09:55:23.220362] Completed epoch 57 with training loss 0.42566758, validation loss 0.45086876
No improvement in validation loss. 13 epochs without improvement.
[2024-05-05 09:56:26.818279] Completed epoch 58 with training loss 0.42679152, validation loss 0.43022159
Validation loss improved to 0.43022159. Model saved.
[2024-05-05 09:57:30.434482] Completed epoch 59 with training loss 0.42536658, validation loss 0.47302449
No improvement in validation loss. 1 epochs without improvement.
[2024-05-05 09:58:34.026053] Completed epoch 60 with training loss 0.42489013, validation loss 0.49783415
No improvement in validation loss. 2 epochs without improvement.
[2024-05-05 09:59:37.524017] Completed epoch 61 with training loss 0.42652848, validation loss 0.44099951
No improvement in validation loss. 3 epochs without improvement.
[2024-05-05 10:00:41.164270] Completed epoch 62 with training loss 0.42291155, validation loss 0.48090482
No improvement in validation loss. 4 epochs without improvement.
[2024-05-05 10:01:44.679514] Completed epoch 63 with training loss 0.42935926, validation loss 0.45246100
No improvement in validation loss. 5 epochs without improvement.
[2024-05-05 10:02:48.163189] Completed epoch 64 with training loss 0.42713469, validation loss 0.45998120
No improvement in validation loss. 6 epochs without improvement.
[2024-05-05 10:03:51.778188] Completed epoch 65 with training loss 0.42691356, validation loss 0.45400676
No improvement in validation loss. 7 epochs without improvement.
[2024-05-05 10:04:55.309981] Completed epoch 66 with training loss 0.42396745, validation loss 0.48689330
No improvement in validation loss. 8 epochs without improvement.
[2024-05-05 10:05:58.958959] Completed epoch 67 with training loss 0.42615947, validation loss 0.49860451
No improvement in validation loss. 9 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:58<00:00,  1.57it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
[2024-05-05 10:07:04.481587] Completed epoch 68 with training loss 0.42473927, validation loss 0.44489887
No improvement in validation loss. 10 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:59<00:00,  1.55it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.40it/s]
[2024-05-05 10:08:10.809342] Completed epoch 69 with training loss 0.42811003, validation loss 0.45645335
No improvement in validation loss. 11 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:58<00:00,  1.58it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
[2024-05-05 10:09:15.733261] Completed epoch 70 with training loss 0.42478997, validation loss 0.43954751
No improvement in validation loss. 12 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.60it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.46it/s]
[2024-05-05 10:10:19.729233] Completed epoch 71 with training loss 0.42583930, validation loss 0.49448416
No improvement in validation loss. 13 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.61it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.47it/s]
[2024-05-05 10:11:23.342037] Completed epoch 72 with training loss 0.42417717, validation loss 0.50362325
No improvement in validation loss. 14 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.61it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.49it/s]
[2024-05-05 10:12:26.983294] Completed epoch 73 with training loss 0.42300090, validation loss 0.43971765
No improvement in validation loss. 15 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.61it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
[2024-05-05 10:13:30.889860] Completed epoch 74 with training loss 0.42711711, validation loss 0.48682278
No improvement in validation loss. 16 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.60it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.42it/s]
[2024-05-05 10:14:35.231602] Completed epoch 75 with training loss 0.42364797, validation loss 0.52352500
No improvement in validation loss. 17 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.60it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.42it/s]
[2024-05-05 10:15:39.602726] Completed epoch 76 with training loss 0.42682052, validation loss 0.44195628
No improvement in validation loss. 18 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.60it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.45it/s]
[2024-05-05 10:16:43.855044] Completed epoch 77 with training loss 0.42513561, validation loss 0.43847373
No improvement in validation loss. 19 epochs without improvement.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 92/92 [00:57<00:00,  1.60it/s]
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.41it/s]
[2024-05-05 10:17:48.003298] Completed epoch 78 with training loss 0.42320386, validation loss 0.44159371
No improvement in validation loss. 20 epochs without improvement.
Early stopping due to no improvement in validation loss.
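The messages above follow a standard patience-based early-stopping scheme: training halts once the validation loss has failed to improve for a fixed number of consecutive epochs (20 here, judging by the log). The sketch below is a minimal, self-contained illustration of that logic — the function name and the idea of passing pre-computed per-epoch losses are illustrative stand-ins, not the notebook's actual training loop.

```python
def train_with_early_stopping(val_losses, patience=20):
    """Walk per-epoch validation losses; stop after `patience` consecutive
    epochs without improvement. Returns (best_epoch, best_loss, stop_epoch)."""
    best_loss = float("inf")
    best_epoch = -1
    epochs_without_improvement = 0
    for epoch, val_loss in enumerate(val_losses):
        if val_loss < best_loss:
            # New best model; in the real loop this is where a checkpoint
            # would be saved and the patience counter reset.
            best_loss, best_epoch = val_loss, epoch
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            print(f"No improvement in validation loss. "
                  f"{epochs_without_improvement} epochs without improvement.")
        if epochs_without_improvement >= patience:
            print("Early stopping due to no improvement in validation loss.")
            return best_epoch, best_loss, epoch
    return best_epoch, best_loss, len(val_losses) - 1
```

This matches the behaviour in the log: the best validation loss was seen at epoch 58 (0.4302), and training stopped 20 non-improving epochs later, at epoch 78.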

Plot Validation and Loss Values from Training
  Epoch with best Validation Loss:   58, 0.4302
Generate AUROC/AUPRC for Each Intermediate Model

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0000.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:05<00:00,  2.72it/s]
Loss: 0.5766444969922304
AUROC: 0.8367550892648379
AUPRC: 0.7101943795746067
Sensitivity: 0.7754010695187166
Specificity: 0.7489421720733427
Threshold: 0.15
Accuracy:  0.7564426478019202

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.38it/s]
Loss: 0.533894677956899
AUROC: 0.8330412986963412
AUPRC: 0.6684195758859306
Sensitivity: 0.7457969065232011
Specificity: 0.7784402575721441
Threshold: 0.16
Accuracy:  0.7698943661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0001.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.39it/s]
Loss: 0.5814922414720058
AUROC: 0.840667104128483
AUPRC: 0.7224776163238393
Sensitivity: 0.7807486631016043
Specificity: 0.7447108603667136
Threshold: 0.14
Accuracy:  0.7549267306720566

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.40it/s]
Loss: 0.5390793303648631
AUROC: 0.8367653618104662
AUPRC: 0.6770979771811446
Sensitivity: 0.7505043712172159
Specificity: 0.7805866921058908
Threshold: 0.15
Accuracy:  0.7727112676056338

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0002.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
Loss: 0.7320823334157467
AUROC: 0.8405891655290145
AUPRC: 0.7281953231353923
Sensitivity: 0.7843137254901961
Specificity: 0.736953455571227
Threshold: 0.06
Accuracy:  0.7503789792824659

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.6705856237146589
AUROC: 0.8349149180808761
AUPRC: 0.6773409311810091
Sensitivity: 0.7256220578345662
Specificity: 0.7977581683758646
Threshold: 0.07
Accuracy:  0.7788732394366197

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0003.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.35it/s]
Loss: 0.6721781529486179
AUROC: 0.8422258761178533
AUPRC: 0.7289308637815999
Sensitivity: 0.7664884135472371
Specificity: 0.7588152327221439
Threshold: 0.08
Accuracy:  0.7609903991915109

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.34it/s]
Loss: 0.6164364669058058
AUROC: 0.8369750654010567
AUPRC: 0.6790415754545044
Sensitivity: 0.7841291190316073
Specificity: 0.7409968995945624
Threshold: 0.08
Accuracy:  0.7522887323943662

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0004.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.35it/s]
Loss: 0.5470827147364616
AUROC: 0.8429172669195899
AUPRC: 0.7268563990060488
Sensitivity: 0.7825311942959001
Specificity: 0.736953455571227
Threshold: 0.16
Accuracy:  0.7498736735725113

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.5127649267514547
AUROC: 0.8378735430412009
AUPRC: 0.6791788301870275
Sensitivity: 0.7659717552118359
Specificity: 0.7672310994514667
Threshold: 0.17
Accuracy:  0.7669014084507042

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0005.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.34it/s]
Loss: 0.5580893252044916
AUROC: 0.8429241808276074
AUPRC: 0.727383374309169
Sensitivity: 0.7664884135472371
Specificity: 0.7637517630465445
Threshold: 0.15
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.35it/s]
Loss: 0.5251552290386624
AUROC: 0.8378113136009337
AUPRC: 0.6794132910577741
Sensitivity: 0.7834566240753195
Specificity: 0.7419508704984498
Threshold: 0.15
Accuracy:  0.7528169014084507

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0006.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.39it/s]
Loss: 0.5612320359796286
AUROC: 0.8427714463141326
AUPRC: 0.7262298514795854
Sensitivity: 0.7593582887700535
Specificity: 0.7750352609308886
Threshold: 0.15
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.34it/s]
Loss: 0.5168395270903905
AUROC: 0.8374516498901122
AUPRC: 0.6790551176980775
Sensitivity: 0.7740416946872899
Specificity: 0.7536370140710709
Threshold: 0.15
Accuracy:  0.7589788732394366

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0007.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.37it/s]
Loss: 0.5394308101385832
AUROC: 0.8428468707652315
AUPRC: 0.7269913795820249
Sensitivity: 0.7736185383244206
Specificity: 0.7475317348377997
Threshold: 0.17
Accuracy:  0.7549267306720566

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.37it/s]
Loss: 0.5017250643836128
AUROC: 0.8373586265000222
AUPRC: 0.6787198579307457
Sensitivity: 0.7632817753866846
Specificity: 0.7691390412592416
Threshold: 0.18
Accuracy:  0.7676056338028169

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0008.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.36it/s]
Loss: 0.5864896159619093
AUROC: 0.8417469308533775
AUPRC: 0.7268732577314511
Sensitivity: 0.7450980392156863
Specificity: 0.7905500705218618
Threshold: 0.13
Accuracy:  0.7776654876200101

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.5402133193280961
AUROC: 0.8360217360377907
AUPRC: 0.6782421495176464
Sensitivity: 0.7612642905178211
Specificity: 0.7648461721917481
Threshold: 0.13
Accuracy:  0.7639084507042253

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0009.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.30it/s]
Loss: 0.5454101264476776
AUROC: 0.841661449808799
AUPRC: 0.7252933340544084
Sensitivity: 0.7593582887700535
Specificity: 0.7729196050775741
Threshold: 0.16
Accuracy:  0.7690752905507833

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.38it/s]
Loss: 0.5124820934401618
AUROC: 0.8360362508943476
AUPRC: 0.6781298188698383
Sensitivity: 0.7807666442501682
Specificity: 0.7426663486763654
Threshold: 0.16
Accuracy:  0.7526408450704225

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0010.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
Loss: 0.578176686540246
AUROC: 0.8410775388498777
AUPRC: 0.7266978019296765
Sensitivity: 0.7629233511586453
Specificity: 0.7679830747531735
Threshold: 0.13
Accuracy:  0.7665487620010106

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.35it/s]
Loss: 0.5327721370591058
AUROC: 0.83522542374159
AUPRC: 0.6770430542069084
Sensitivity: 0.7397444519166106
Specificity: 0.7839255902694968
Threshold: 0.14
Accuracy:  0.7723591549295775

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0011.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.40it/s]
Loss: 0.5627537481486797
AUROC: 0.8405388825616154
AUPRC: 0.7238320049316976
Sensitivity: 0.7450980392156863
Specificity: 0.7877291960507757
Threshold: 0.15
Accuracy:  0.775644264780192

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.37it/s]
Loss: 0.5211466736263699
AUROC: 0.8346899779005293
AUPRC: 0.6767897178182914
Sensitivity: 0.7639542703429725
Specificity: 0.7562604340567612
Threshold: 0.15
Accuracy:  0.7582746478873239

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0012.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.34it/s]
Loss: 0.6056887991726398
AUROC: 0.8402510125732561
AUPRC: 0.7266692725742069
Sensitivity: 0.7771836007130125
Specificity: 0.7390691114245416
Threshold: 0.1
Accuracy:  0.7498736735725113

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.5602250105804867
AUROC: 0.8340851654797898
AUPRC: 0.6760047476205322
Sensitivity: 0.7484868863483524
Specificity: 0.7710469830670165
Threshold: 0.11
Accuracy:  0.7651408450704226

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0013.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.33it/s]
Loss: 0.5739219635725021
AUROC: 0.8403666633982738
AUPRC: 0.7266959012308821
Sensitivity: 0.7486631016042781
Specificity: 0.7884344146685472
Threshold: 0.13
Accuracy:  0.7771601819100555

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.39it/s]
Loss: 0.5318174895313051
AUROC: 0.834215799188804
AUPRC: 0.6755843008307322
Sensitivity: 0.7639542703429725
Specificity: 0.7553064631528739
Threshold: 0.13
Accuracy:  0.7575704225352112

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0014.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.41it/s]
Loss: 0.5618194509297609
AUROC: 0.8400335387392552
AUPRC: 0.7256975224877991
Sensitivity: 0.7736185383244206
Specificity: 0.7461212976022567
Threshold: 0.14
Accuracy:  0.7539161192521475

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.38it/s]
Loss: 0.5208871728844113
AUROC: 0.8340145960114458
AUPRC: 0.6748714814226333
Sensitivity: 0.7545393409549428
Specificity: 0.7662771285475793
Threshold: 0.15
Accuracy:  0.7632042253521126

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0015.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
Loss: 0.5488193947821856
AUROC: 0.8404798000749216
AUPRC: 0.7273453369027281
Sensitivity: 0.7468805704099821
Specificity: 0.7891396332863188
Threshold: 0.15
Accuracy:  0.7771601819100555

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.5114674465523825
AUROC: 0.8342695282158386
AUPRC: 0.6746846549563309
Sensitivity: 0.7626092804303968
Specificity: 0.7557834486048175
Threshold: 0.15
Accuracy:  0.7575704225352112

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0016.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.34it/s]
Loss: 0.5495932940393686
AUROC: 0.8403452931371292
AUPRC: 0.726918204036638
Sensitivity: 0.7718360071301248
Specificity: 0.7552891396332864
Threshold: 0.14
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.35it/s]
Loss: 0.5108841531806522
AUROC: 0.8342300734676281
AUPRC: 0.6750422661074212
Sensitivity: 0.7558843308675185
Specificity: 0.7615072740281421
Threshold: 0.15
Accuracy:  0.7600352112676056

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0017.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
Loss: 0.5501173213124275
AUROC: 0.8410756532386003
AUPRC: 0.7281396410355998
Sensitivity: 0.7522281639928698
Specificity: 0.7863187588152327
Threshold: 0.15
Accuracy:  0.7766548762001011

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.31it/s]
Loss: 0.5133267876174715
AUROC: 0.834695831958699
AUPRC: 0.6750561609289495
Sensitivity: 0.7760591795561533
Specificity: 0.7471977104698306
Threshold: 0.15
Accuracy:  0.7547535211267605

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0018.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.35it/s]
Loss: 0.5378205738961697
AUROC: 0.8421567370376796
AUPRC: 0.727831378342811
Sensitivity: 0.7629233511586453
Specificity: 0.7729196050775741
Threshold: 0.15
Accuracy:  0.7700859019706923

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.34it/s]
Loss: 0.5007320814662509
AUROC: 0.8366456342920142
AUPRC: 0.6768067636932198
Sensitivity: 0.7525218560860794
Specificity: 0.7708084903410446
Threshold: 0.16
Accuracy:  0.7660211267605633

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0019.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
Loss: 0.5468268692493439
AUROC: 0.8420385720642918
AUPRC: 0.7297545216205704
Sensitivity: 0.7540106951871658
Specificity: 0.7813822284908322
Threshold: 0.14
Accuracy:  0.7736230419403739

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.35it/s]
Loss: 0.5045146081182692
AUROC: 0.8355562983170304
AUPRC: 0.6754201256567497
Sensitivity: 0.773369199731002
Specificity: 0.74839017409969
Threshold: 0.14
Accuracy:  0.7549295774647887

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0020.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.36it/s]
Loss: 0.570230670273304
AUROC: 0.8421906780406739
AUPRC: 0.7282481066036343
Sensitivity: 0.7664884135472371
Specificity: 0.7616361071932299
Threshold: 0.12
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.5257135255469216
AUROC: 0.8362852488479936
AUPRC: 0.6758085824915133
Sensitivity: 0.7646267652992602
Specificity: 0.760076317672311
Threshold: 0.13
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0021.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
Loss: 0.513946320861578
AUROC: 0.8419254353876441
AUPRC: 0.727552871737383
Sensitivity: 0.7736185383244206
Specificity: 0.7503526093088858
Threshold: 0.16
Accuracy:  0.7569479535118747

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.39it/s]
Loss: 0.47993041906091904
AUROC: 0.8350873321228531
AUPRC: 0.67504823522419
Sensitivity: 0.7478143913920645
Specificity: 0.7648461721917481
Threshold: 0.18
Accuracy:  0.7603873239436619

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0022.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.30it/s]
Loss: 0.5432496294379234
AUROC: 0.8433082169911175
AUPRC: 0.7288616292055511
Sensitivity: 0.7664884135472371
Specificity: 0.7623413258110014
Threshold: 0.14
Accuracy:  0.7635169277412834

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.5022091408570607
AUROC: 0.8368241429698936
AUPRC: 0.6756840753470696
Sensitivity: 0.7666442501681238
Specificity: 0.7548294777009301
Threshold: 0.15
Accuracy:  0.7579225352112676

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0023.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.38it/s]
Loss: 0.48990533500909805
AUROC: 0.8436576836145409
AUPRC: 0.7301238344546647
Sensitivity: 0.768270944741533
Specificity: 0.7566995768688294
Threshold: 0.18
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.34it/s]
Loss: 0.46235516866048176
AUROC: 0.836823822199583
AUPRC: 0.6756662977195553
Sensitivity: 0.7511768661735037
Specificity: 0.76508466491772
Threshold: 0.2
Accuracy:  0.761443661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0024.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.33it/s]
Loss: 0.5611026957631111
AUROC: 0.8426872223437394
AUPRC: 0.7278513748828791
Sensitivity: 0.7664884135472371
Specificity: 0.7757404795486601
Threshold: 0.1
Accuracy:  0.7731177362304194

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.5208178606298235
AUROC: 0.8367967171083326
AUPRC: 0.6760467492973734
Sensitivity: 0.7565568258238063
Specificity: 0.7653231576436919
Threshold: 0.11
Accuracy:  0.7630281690140845

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0025.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.39it/s]
Loss: 0.47930797934532166
AUROC: 0.844267364594254
AUPRC: 0.72960930027543
Sensitivity: 0.7557932263814616
Specificity: 0.7743300423131171
Threshold: 0.22
Accuracy:  0.7690752905507833

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.37it/s]
Loss: 0.45238074527846445
AUROC: 0.838542188753761
AUPRC: 0.6772182407519968
Sensitivity: 0.753866845998655
Specificity: 0.7712854757929883
Threshold: 0.23
Accuracy:  0.766725352112676

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0026.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.37it/s]
Loss: 0.5505655463784933
AUROC: 0.8436451128726911
AUPRC: 0.728500288128981
Sensitivity: 0.7700534759358288
Specificity: 0.7623413258110014
Threshold: 0.11
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.30it/s]
Loss: 0.5096750885248185
AUROC: 0.8375038552581712
AUPRC: 0.6762107802080186
Sensitivity: 0.753866845998655
Specificity: 0.7722394466968757
Threshold: 0.13
Accuracy:  0.7674295774647887

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0027.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.34it/s]
Loss: 0.5005566291511059
AUROC: 0.8446897415204062
AUPRC: 0.7308294445946545
Sensitivity: 0.7557932263814616
Specificity: 0.7736248236953456
Threshold: 0.18
Accuracy:  0.7685699848408287

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.37it/s]
Loss: 0.4698571261432436
AUROC: 0.8410575091447606
AUPRC: 0.6785410963910624
Sensitivity: 0.7491593813046402
Specificity: 0.7746243739565943
Threshold: 0.19
Accuracy:  0.7679577464788733

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0028.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.35it/s]
Loss: 0.5343917086720467
AUROC: 0.8461127494977989
AUPRC: 0.7302571028008937
Sensitivity: 0.7629233511586453
Specificity: 0.7722143864598026
Threshold: 0.14
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.31it/s]
Loss: 0.48741625944773354
AUROC: 0.8434073120554625
AUPRC: 0.679511490487734
Sensitivity: 0.7652992602555481
Specificity: 0.7669926067254949
Threshold: 0.15
Accuracy:  0.7665492957746479

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0029.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.31it/s]
Loss: 0.49095501378178596
AUROC: 0.8447136259299206
AUPRC: 0.7304540112097425
Sensitivity: 0.7647058823529411
Specificity: 0.7602256699576869
Threshold: 0.18
Accuracy:  0.7614957049014653

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.46152065363195205
AUROC: 0.8397412281749886
AUPRC: 0.6774177424691264
Sensitivity: 0.7679892400806994
Specificity: 0.7524445504412115
Threshold: 0.19
Accuracy:  0.7565140845070423

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0030.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.31it/s]
Loss: 0.4844560530036688
AUROC: 0.8458261365836243
AUPRC: 0.7316964395666118
Sensitivity: 0.7664884135472371
Specificity: 0.7637517630465445
Threshold: 0.2
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.26it/s]
Loss: 0.4504017667637931
AUROC: 0.8423609593021064
AUPRC: 0.679274743223105
Sensitivity: 0.7679892400806994
Specificity: 0.7619842594800859
Threshold: 0.21
Accuracy:  0.763556338028169

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0031.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.30it/s]
Loss: 0.5158301200717688
AUROC: 0.845215198529726
AUPRC: 0.7289969541548302
Sensitivity: 0.7664884135472371
Specificity: 0.764456981664316
Threshold: 0.14
Accuracy:  0.7650328448711471

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.35it/s]
Loss: 0.47409283618132275
AUROC: 0.841270821401346
AUPRC: 0.6795848091500061
Sensitivity: 0.769334229993275
Specificity: 0.758406868590508
Threshold: 0.15
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0032.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.34it/s]
Loss: 0.4939976017922163
AUROC: 0.8442384518879997
AUPRC: 0.7315437052405469
Sensitivity: 0.7575757575757576
Specificity: 0.7722143864598026
Threshold: 0.2
Accuracy:  0.7680646791308742

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.35it/s]
Loss: 0.4589846409029431
AUROC: 0.8402398656229015
AUPRC: 0.6778164049483032
Sensitivity: 0.7525218560860794
Specificity: 0.7743858812306225
Threshold: 0.21
Accuracy:  0.768661971830986

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0033.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.29it/s]
Loss: 0.4537293631583452
AUROC: 0.844639458553007
AUPRC: 0.7263947893480668
Sensitivity: 0.7771836007130125
Specificity: 0.7545839210155149
Threshold: 0.25
Accuracy:  0.7609903991915109

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.31it/s]
Loss: 0.4342841810650296
AUROC: 0.8429182977168691
AUPRC: 0.6809561944315297
Sensitivity: 0.7753866845998655
Specificity: 0.7605533031242547
Threshold: 0.26
Accuracy:  0.7644366197183099

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0034.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.36it/s]
Loss: 0.47647335194051266
AUROC: 0.8471787484066584
AUPRC: 0.7328878790549735
Sensitivity: 0.768270944741533
Specificity: 0.7609308885754584
Threshold: 0.19
Accuracy:  0.763011622031329

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.35it/s]
Loss: 0.44553095532788173
AUROC: 0.8449462878134066
AUPRC: 0.682928566563497
Sensitivity: 0.7632817753866846
Specificity: 0.7674695921774386
Threshold: 0.2
Accuracy:  0.7663732394366197

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0035.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.32it/s]
Loss: 0.5163695812225342
AUROC: 0.8465678103527601
AUPRC: 0.7273342731320159
Sensitivity: 0.7629233511586453
Specificity: 0.7708039492242595
Threshold: 0.13
Accuracy:  0.7685699848408287

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.47582270867294735
AUROC: 0.8431476484889874
AUPRC: 0.6802317034651852
Sensitivity: 0.773369199731002
Specificity: 0.7658001430956356
Threshold: 0.14
Accuracy:  0.7677816901408451

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0036.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.31it/s]
Loss: 0.45783159509301186
AUROC: 0.8442925060779537
AUPRC: 0.725934533688875
Sensitivity: 0.7629233511586453
Specificity: 0.7736248236953456
Threshold: 0.23
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.4367400778664483
AUROC: 0.8429470868522504
AUPRC: 0.6801484919668899
Sensitivity: 0.753866845998655
Specificity: 0.780348199379919
Threshold: 0.24
Accuracy:  0.7734154929577465

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0037.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.35it/s]
Loss: 0.5442594848573208
AUROC: 0.8469600174984726
AUPRC: 0.7331337238053963
Sensitivity: 0.7611408199643493
Specificity: 0.7729196050775741
Threshold: 0.12
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.37it/s]
Loss: 0.49172576467196144
AUROC: 0.8428839752936292
AUPRC: 0.6810886946248302
Sensitivity: 0.7713517148621385
Specificity: 0.7579298831385642
Threshold: 0.13
Accuracy:  0.761443661971831

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0038.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.29it/s]
Loss: 0.4867308884859085
AUROC: 0.8461605183168279
AUPRC: 0.7319560780577755
Sensitivity: 0.7664884135472371
Specificity: 0.7588152327221439
Threshold: 0.18
Accuracy:  0.7609903991915109

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.4483408444457584
AUROC: 0.8437896702657631
AUPRC: 0.6800824795468969
Sensitivity: 0.7679892400806994
Specificity: 0.7612687813021702
Threshold: 0.19
Accuracy:  0.7630281690140845

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0039.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.32it/s]
Loss: 0.5417625773698092
AUROC: 0.8484458791851143
AUPRC: 0.7346914930333022
Sensitivity: 0.7611408199643493
Specificity: 0.7722143864598026
Threshold: 0.1
Accuracy:  0.7690752905507833

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:20<00:00,  2.18it/s]
Loss: 0.4965846422645781
AUROC: 0.845687347423597
AUPRC: 0.6829703184680315
Sensitivity: 0.7747141896435776
Specificity: 0.7669926067254949
Threshold: 0.11
Accuracy:  0.7690140845070422

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0040.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.32it/s]
Loss: 0.4837268032133579
AUROC: 0.8447840220842793
AUPRC: 0.7255641641809404
Sensitivity: 0.7557932263814616
Specificity: 0.767277856135402
Threshold: 0.2
Accuracy:  0.764022233451238

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.31it/s]
Loss: 0.45013836953375075
AUROC: 0.8419670533606224
AUPRC: 0.6797200547136013
Sensitivity: 0.7599193006052455
Specificity: 0.7705699976150727
Threshold: 0.21
Accuracy:  0.7677816901408451

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0041.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:07<00:00,  2.18it/s]
Loss: 0.484563734382391
AUROC: 0.8446520292948567
AUPRC: 0.727003580243105
Sensitivity: 0.7647058823529411
Specificity: 0.7566995768688294
Threshold: 0.17
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.29it/s]
Loss: 0.44952016605271233
AUROC: 0.842858955209398
AUPRC: 0.679987217022347
Sensitivity: 0.7679892400806994
Specificity: 0.7624612449320296
Threshold: 0.18
Accuracy:  0.7639084507042253

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0042.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.30it/s]
Loss: 0.4807850308716297
AUROC: 0.8460385821208853
AUPRC: 0.7314099776774436
Sensitivity: 0.7611408199643493
Specificity: 0.7757404795486601
Threshold: 0.18
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.40it/s]
Loss: 0.45209976269139185
AUROC: 0.8423780403211487
AUPRC: 0.6783239013390578
Sensitivity: 0.7592468056489576
Specificity: 0.7665156212735511
Threshold: 0.19
Accuracy:  0.764612676056338

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0043.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.35it/s]
Loss: 0.4731606524437666
AUROC: 0.8460951504592091
AUPRC: 0.7290494009698256
Sensitivity: 0.7611408199643493
Specificity: 0.765867418899859
Threshold: 0.18
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.34it/s]
Loss: 0.44522544576062095
AUROC: 0.8431496533034288
AUPRC: 0.6793985663075173
Sensitivity: 0.7666442501681238
Specificity: 0.7648461721917481
Threshold: 0.19
Accuracy:  0.7653169014084507

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0044.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:07<00:00,  2.27it/s]
Loss: 0.42955601401627064
AUROC: 0.8461869168747125
AUPRC: 0.7286487922202542
Sensitivity: 0.7647058823529411
Specificity: 0.7729196050775741
Threshold: 0.23
Accuracy:  0.7705912076806468

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.31it/s]
Loss: 0.4171357065439224
AUROC: 0.8432706639031234
AUPRC: 0.6787577745113803
Sensitivity: 0.7646267652992602
Specificity: 0.7691390412592416
Threshold: 0.24
Accuracy:  0.7679577464788733

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0045.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.33it/s]
Loss: 0.44524556025862694
AUROC: 0.845638832530063
AUPRC: 0.7291672241625626
Sensitivity: 0.7664884135472371
Specificity: 0.7743300423131171
Threshold: 0.26
Accuracy:  0.7721071248105104

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.30it/s]
Loss: 0.42870609727170733
AUROC: 0.8425069097934543
AUPRC: 0.6789085189321309
Sensitivity: 0.7632817753866846
Specificity: 0.770093012163129
Threshold: 0.27
Accuracy:  0.7683098591549296

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0046.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.32it/s]
Loss: 0.4761878680437803
AUROC: 0.8458248795094394
AUPRC: 0.7282076744254891
Sensitivity: 0.7664884135472371
Specificity: 0.7623413258110014
Threshold: 0.24
Accuracy:  0.7635169277412834

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:20<00:00,  2.23it/s]
Loss: 0.44726861317952477
AUROC: 0.8433688196181841
AUPRC: 0.6795491577243611
Sensitivity: 0.7552118359112306
Specificity: 0.7736704030527068
Threshold: 0.25
Accuracy:  0.768838028169014

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0047.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:07<00:00,  2.25it/s]
Loss: 0.5130687523633242
AUROC: 0.8472202318547628
AUPRC: 0.7330073990828974
Sensitivity: 0.7557932263814616
Specificity: 0.771509167842031
Threshold: 0.15
Accuracy:  0.7670540677109652

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.37it/s]
Loss: 0.4744901054435306
AUROC: 0.8433888677625999
AUPRC: 0.6784414298804728
Sensitivity: 0.7652992602555481
Specificity: 0.7648461721917481
Threshold: 0.16
Accuracy:  0.7649647887323944

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0048.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.35it/s]
Loss: 0.5722354725003242
AUROC: 0.8475169013624171
AUPRC: 0.7385260161242525
Sensitivity: 0.7771836007130125
Specificity: 0.7531734837799718
Threshold: 0.09
Accuracy:  0.7599797877716018

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.34it/s]
Loss: 0.5252249264054828
AUROC: 0.8433273600555319
AUPRC: 0.6798447736084893
Sensitivity: 0.7760591795561533
Specificity: 0.7536370140710709
Threshold: 0.1
Accuracy:  0.7595070422535212

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0049.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.37it/s]
Loss: 0.43206287547945976
AUROC: 0.8471297225134444
AUPRC: 0.7317291577850588
Sensitivity: 0.7629233511586453
Specificity: 0.7693935119887165
Threshold: 0.22
Accuracy:  0.7675593734209196

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.35it/s]
Loss: 0.41740948557853697
AUROC: 0.8439393898082612
AUPRC: 0.6796463404418542
Sensitivity: 0.7659717552118359
Specificity: 0.7665156212735511
Threshold: 0.23
Accuracy:  0.7663732394366197

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0050.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.33it/s]
Loss: 0.4533588830381632
AUROC: 0.8471448074036642
AUPRC: 0.7294288998915434
Sensitivity: 0.7807486631016043
Specificity: 0.7510578279266573
Threshold: 0.2
Accuracy:  0.7594744820616472

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.43118088377846614
AUROC: 0.8456137306373017
AUPRC: 0.6803325482973362
Sensitivity: 0.7747141896435776
Specificity: 0.7564989267827331
Threshold: 0.21
Accuracy:  0.7612676056338028

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0051.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.39it/s]
Loss: 0.48039026372134686
AUROC: 0.8464144473021932
AUPRC: 0.7312244704881891
Sensitivity: 0.768270944741533
Specificity: 0.770098730606488
Threshold: 0.16
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.30it/s]
Loss: 0.45051755176650154
AUROC: 0.8424842954865532
AUPRC: 0.6801098994881578
Sensitivity: 0.7686617350369872
Specificity: 0.7541139995230145
Threshold: 0.17
Accuracy:  0.7579225352112676

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0052.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.29it/s]
Loss: 0.44419755414128304
AUROC: 0.8460687519013248
AUPRC: 0.7289178692582838
Sensitivity: 0.786096256684492
Specificity: 0.7531734837799718
Threshold: 0.24
Accuracy:  0.7625063163213744

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.32it/s]
Loss: 0.429237875673506
AUROC: 0.8432244729783892
AUPRC: 0.679208390125435
Sensitivity: 0.769334229993275
Specificity: 0.7598378249463391
Threshold: 0.25
Accuracy:  0.7623239436619719

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0053.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.37it/s]
Loss: 0.48810929246246815
AUROC: 0.8463377657769097
AUPRC: 0.7331626884687634
Sensitivity: 0.7647058823529411
Specificity: 0.763046544428773
Threshold: 0.15
Accuracy:  0.7635169277412834

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.29it/s]
Loss: 0.4583648390240139
AUROC: 0.8438089164844023
AUPRC: 0.679662841583716
Sensitivity: 0.7572293207800942
Specificity: 0.7753398521345098
Threshold: 0.17
Accuracy:  0.7705985915492958

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0054.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.39it/s]
Loss: 0.44502188824117184
AUROC: 0.8460888650882843
AUPRC: 0.7322062578867999
Sensitivity: 0.7629233511586453
Specificity: 0.771509167842031
Threshold: 0.22
Accuracy:  0.7690752905507833

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.38it/s]
Loss: 0.42278542386160956
AUROC: 0.8431622435381223
AUPRC: 0.6811452347751495
Sensitivity: 0.7599193006052455
Specificity: 0.7643691867398045
Threshold: 0.23
Accuracy:  0.7632042253521126

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0055.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.36it/s]
Loss: 0.4715195298194885
AUROC: 0.847046755617236
AUPRC: 0.7288683798538274
Sensitivity: 0.7522281639928698
Specificity: 0.7764456981664316
Threshold: 0.23
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.30it/s]
Loss: 0.444439074728224
AUROC: 0.8454434817949215
AUPRC: 0.6799824561548798
Sensitivity: 0.7780766644250168
Specificity: 0.7512520868113522
Threshold: 0.23
Accuracy:  0.7582746478873239

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0056.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.36it/s]
Loss: 0.4974107537418604
AUROC: 0.8467098597356624
AUPRC: 0.7328102925237503
Sensitivity: 0.7664884135472371
Specificity: 0.771509167842031
Threshold: 0.16
Accuracy:  0.7700859019706923

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.4620446231630113
AUROC: 0.841962402191118
AUPRC: 0.679637937800592
Sensitivity: 0.7679892400806994
Specificity: 0.7569759122346769
Threshold: 0.17
Accuracy:  0.7598591549295775

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0057.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.34it/s]
Loss: 0.4526430666446686
AUROC: 0.846813568355923
AUPRC: 0.7272465570224258
Sensitivity: 0.7647058823529411
Specificity: 0.7722143864598026
Threshold: 0.22
Accuracy:  0.7700859019706923

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.32it/s]
Loss: 0.4256614863872528
AUROC: 0.8441753163717478
AUPRC: 0.6805559232422138
Sensitivity: 0.7572293207800942
Specificity: 0.7672310994514667
Threshold: 0.23
Accuracy:  0.764612676056338

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0058.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.36it/s]
Loss: 0.42864670418202877
AUROC: 0.8455784929691841
AUPRC: 0.729217887865657
Sensitivity: 0.7629233511586453
Specificity: 0.7757404795486601
Threshold: 0.27
Accuracy:  0.7721071248105104

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.4226233250564999
AUROC: 0.8442036243516631
AUPRC: 0.6805487161262807
Sensitivity: 0.7666442501681238
Specificity: 0.7669926067254949
Threshold: 0.28
Accuracy:  0.7669014084507042

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0059.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.37it/s]
Loss: 0.4700356274843216
AUROC: 0.8473038272880635
AUPRC: 0.7295680384559642
Sensitivity: 0.7700534759358288
Specificity: 0.7566995768688294
Threshold: 0.19
Accuracy:  0.7604850934815564

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.29it/s]
Loss: 0.43939194944169785
AUROC: 0.8454284857828985
AUPRC: 0.6810814053459775
Sensitivity: 0.773369199731002
Specificity: 0.7505366086334366
Threshold: 0.2
Accuracy:  0.7565140845070423

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0060.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.36it/s]
Loss: 0.504686588421464
AUROC: 0.8481567521225697
AUPRC: 0.7346350785703647
Sensitivity: 0.7718360071301248
Specificity: 0.7559943582510579
Threshold: 0.15
Accuracy:  0.7604850934815564

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.36it/s]
Loss: 0.46262561712000105
AUROC: 0.8460544690441413
AUPRC: 0.6797137417411754
Sensitivity: 0.7531943510423672
Specificity: 0.7741473885046506
Threshold: 0.17
Accuracy:  0.768661971830986

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0061.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.39it/s]
Loss: 0.4406052138656378
AUROC: 0.8488085450874798
AUPRC: 0.7308357291864057
Sensitivity: 0.7664884135472371
Specificity: 0.768688293370945
Threshold: 0.22
Accuracy:  0.7680646791308742

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.40it/s]
Loss: 0.4196978492869271
AUROC: 0.8463277653488193
AUPRC: 0.6824823217100485
Sensitivity: 0.7639542703429725
Specificity: 0.7660386358216075
Threshold: 0.23
Accuracy:  0.7654929577464789

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0062.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.42it/s]
Loss: 0.48094186931848526
AUROC: 0.8424697485097387
AUPRC: 0.7256688901949622
Sensitivity: 0.7522281639928698
Specificity: 0.7736248236953456
Threshold: 0.22
Accuracy:  0.7675593734209196

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.41it/s]
Loss: 0.4516061590777503
AUROC: 0.840559272659736
AUPRC: 0.6771638229300881
Sensitivity: 0.7747141896435776
Specificity: 0.7526830431671834
Threshold: 0.22
Accuracy:  0.7584507042253521

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0063.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
Loss: 0.45032707788050175
AUROC: 0.8476400946325446
AUPRC: 0.7291820739929913
Sensitivity: 0.7557932263814616
Specificity: 0.7743300423131171
Threshold: 0.23
Accuracy:  0.7690752905507833

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.42it/s]
Loss: 0.42744472026824953
AUROC: 0.8455045083465238
AUPRC: 0.6817558776651382
Sensitivity: 0.753866845998655
Specificity: 0.7762938230383973
Threshold: 0.24
Accuracy:  0.7704225352112676

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0064.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
Loss: 0.45798348262906075
AUROC: 0.8438475018164722
AUPRC: 0.7201473029632298
Sensitivity: 0.7664884135472371
Specificity: 0.7581100141043724
Threshold: 0.27
Accuracy:  0.7604850934815564

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.40it/s]
Loss: 0.4477596680323283
AUROC: 0.844573713097581
AUPRC: 0.6796863480742792
Sensitivity: 0.7700067249495629
Specificity: 0.7762938230383973
Threshold: 0.29
Accuracy:  0.7746478873239436

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0065.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.43it/s]
Loss: 0.4510997664183378
AUROC: 0.8464094190054531
AUPRC: 0.7287041463363186
Sensitivity: 0.7575757575757576
Specificity: 0.7813822284908322
Threshold: 0.24
Accuracy:  0.774633653360283

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.40it/s]
Loss: 0.4312119579977459
AUROC: 0.844754226589902
AUPRC: 0.6804912565116779
Sensitivity: 0.7753866845998655
Specificity: 0.7498211304555211
Threshold: 0.24
Accuracy:  0.7565140845070423

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0066.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.42it/s]
Loss: 0.4885311108082533
AUROC: 0.8451108613723731
AUPRC: 0.7286323659929049
Sensitivity: 0.7575757575757576
Specificity: 0.767277856135402
Threshold: 0.17
Accuracy:  0.7645275391611925

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.39it/s]
Loss: 0.4512812654177348
AUROC: 0.8417559864962115
AUPRC: 0.6784030801563402
Sensitivity: 0.7626092804303968
Specificity: 0.7586453613164799
Threshold: 0.18
Accuracy:  0.7596830985915493

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0067.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.43it/s]
Loss: 0.4913977514952421
AUROC: 0.8491548690254407
AUPRC: 0.737879256488645
Sensitivity: 0.7700534759358288
Specificity: 0.7574047954866009
Threshold: 0.12
Accuracy:  0.7609903991915109

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.41it/s]
Loss: 0.457186057832506
AUROC: 0.8462745174772506
AUPRC: 0.6827164064293321
Sensitivity: 0.7646267652992602
Specificity: 0.7686620558072979
Threshold: 0.14
Accuracy:  0.7676056338028169

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0068.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.40it/s]
Loss: 0.4479974452406168
AUROC: 0.8476187243714001
AUPRC: 0.729982000880754
Sensitivity: 0.7718360071301248
Specificity: 0.7538787023977433
Threshold: 0.23
Accuracy:  0.7589691763516928

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.33it/s]
Loss: 0.4237896647718218
AUROC: 0.8455877482421387
AUPRC: 0.6784671177426738
Sensitivity: 0.7713517148621385
Specificity: 0.7510135940853804
Threshold: 0.24
Accuracy:  0.7563380281690141

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0069.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.40it/s]
Loss: 0.4539283700287342
AUROC: 0.8496639840703559
AUPRC: 0.7334097947812054
Sensitivity: 0.7611408199643493
Specificity: 0.7722143864598026
Threshold: 0.19
Accuracy:  0.7690752905507833

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.40it/s]
Loss: 0.42740449706713357
AUROC: 0.8471380311535334
AUPRC: 0.6824481160902529
Sensitivity: 0.7652992602555481
Specificity: 0.7610302885761985
Threshold: 0.2
Accuracy:  0.7621478873239437

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0070.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.40it/s]
Loss: 0.4405379369854927
AUROC: 0.8473987363890292
AUPRC: 0.7299777398950047
Sensitivity: 0.7754010695187166
Specificity: 0.7552891396332864
Threshold: 0.22
Accuracy:  0.7609903991915109

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.34it/s]
Loss: 0.42051863736576506
AUROC: 0.8459175803140695
AUPRC: 0.6817808461621205
Sensitivity: 0.7700067249495629
Specificity: 0.7588838540424517
Threshold: 0.23
Accuracy:  0.7617957746478873

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0071.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.37it/s]
Loss: 0.4910184647887945
AUROC: 0.8503679456139425
AUPRC: 0.7361898199330382
Sensitivity: 0.7700534759358288
Specificity: 0.7588152327221439
Threshold: 0.13
Accuracy:  0.7620010106114199

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.37it/s]
Loss: 0.4542519125673506
AUROC: 0.8487413213587637
AUPRC: 0.685156686933125
Sensitivity: 0.769334229993275
Specificity: 0.7674695921774386
Threshold: 0.15
Accuracy:  0.7679577464788733

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0072.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
Loss: 0.5014353152364492
AUROC: 0.8475194155107869
AUPRC: 0.7349541984656299
Sensitivity: 0.7700534759358288
Specificity: 0.7588152327221439
Threshold: 0.14
Accuracy:  0.7620010106114199

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.34it/s]
Loss: 0.46740964584880407
AUROC: 0.8466555926063085
AUPRC: 0.6828910520057416
Sensitivity: 0.7821116341627438
Specificity: 0.7619842594800859
Threshold: 0.15
Accuracy:  0.7672535211267606

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0073.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.35it/s]
Loss: 0.44261107966303825
AUROC: 0.8414716316068676
AUPRC: 0.7267401298560369
Sensitivity: 0.768270944741533
Specificity: 0.770098730606488
Threshold: 0.26
Accuracy:  0.7695805962607377

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.39it/s]
Loss: 0.43074404266145494
AUROC: 0.8383994459655194
AUPRC: 0.6767591578197593
Sensitivity: 0.7511768661735037
Specificity: 0.7753398521345098
Threshold: 0.27
Accuracy:  0.7690140845070422

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0074.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.44it/s]
Loss: 0.4856063388288021
AUROC: 0.8486683813158549
AUPRC: 0.7363156565236157
Sensitivity: 0.7718360071301248
Specificity: 0.7602256699576869
Threshold: 0.15
Accuracy:  0.7635169277412834

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.39it/s]
Loss: 0.4546901477707757
AUROC: 0.843915572612695
AUPRC: 0.681707493974143
Sensitivity: 0.7639542703429725
Specificity: 0.7636537085618889
Threshold: 0.17
Accuracy:  0.7637323943661972

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0075.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.42it/s]
Loss: 0.5264334082603455
AUROC: 0.8517532413657859
AUPRC: 0.7398571910549884
Sensitivity: 0.7629233511586453
Specificity: 0.7785613540197461
Threshold: 0.13
Accuracy:  0.7741283476503285

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.39it/s]
Loss: 0.48060300449530285
AUROC: 0.8497795746617759
AUPRC: 0.6832739841934459
Sensitivity: 0.773369199731002
Specificity: 0.7653231576436919
Threshold: 0.14
Accuracy:  0.7674295774647887

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0076.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.40it/s]
Loss: 0.4422839377075434
AUROC: 0.8466394635813038
AUPRC: 0.7284423982377303
Sensitivity: 0.7664884135472371
Specificity: 0.7736248236953456
Threshold: 0.26
Accuracy:  0.7716018191005558

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.41it/s]
Loss: 0.4273363524013095
AUROC: 0.8457184621437306
AUPRC: 0.6809220501096582
Sensitivity: 0.7706792199058508
Specificity: 0.7660386358216075
Threshold: 0.27
Accuracy:  0.7672535211267606

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0077.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.43it/s]
Loss: 0.43405881337821484
AUROC: 0.8467236875516972
AUPRC: 0.7266327743538239
Sensitivity: 0.7593582887700535
Specificity: 0.7827926657263752
Threshold: 0.25
Accuracy:  0.7761495704901465

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.41it/s]
Loss: 0.4210082425011529
AUROC: 0.8453105225011552
AUPRC: 0.6794484875960832
Sensitivity: 0.777404169468729
Specificity: 0.7522060577152397
Threshold: 0.25
Accuracy:  0.7588028169014085

Intermediate Model:
  ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0078.model
AUROC/AUPRC on Validation Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 16/16 [00:06<00:00,  2.41it/s]
Loss: 0.4399308357387781
AUROC: 0.8453635332835532
AUPRC: 0.7212746471102922
Sensitivity: 0.7700534759358288
Specificity: 0.767277856135402
Threshold: 0.27
Accuracy:  0.7680646791308742

AUROC/AUPRC on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.39it/s]
Loss: 0.4207736929257711
AUROC: 0.8448236733621587
AUPRC: 0.6796096508856339
Sensitivity: 0.7599193006052455
Specificity: 0.7715239685189602
Threshold: 0.28
Accuracy:  0.7684859154929577


Plot AUROC/AUPRC for Each Intermediate Model
  Epoch with best Validation Loss:      58, 0.4302
  Epoch with best model Test AUROC:     75, 0.8498
  Epoch with best model Test Accuracy:   2, 0.7789

AUROC/AUPRC Plots - Best Model Based on Validation Loss
  Epoch with best Validation Loss:   58, 0.4302
  Best Model Based on Validation Loss:
    ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0058.model

Generate Stats Based on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:18<00:00,  2.37it/s]
Loss: 0.4226233250564999
AUROC: 0.8442036243516631
AUPRC: 0.6805487161262807
Sensitivity: 0.7666442501681238
Specificity: 0.7669926067254949
Threshold: 0.28
Accuracy:  0.7669014084507042
best_model_val_test_auroc: 0.8442036243516631
best_model_val_test_auprc: 0.6805487161262807

AUROC/AUPRC Plots - Best Model Based on Model AUROC
  Epoch with best model Test AUROC:  75, 0.8498
  Best Model Based on Model AUROC:
    ./vitaldb_cache/models/ABP_EEG_ECG_12_RESIDUAL_BLOCKS_128_BATCH_SIZE_1e-04_LEARNING_RATE_1e-01_WEIGHT_DECAY_003_MINS__ALL_MAX_CASES_de2a8d0e_0075.model

Generate Stats Based on Test Data
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 45/45 [00:19<00:00,  2.37it/s]
Loss: 0.48060300449530285
AUROC: 0.8497795746617759
AUPRC: 0.6832739841934459
Sensitivity: 0.773369199731002
Specificity: 0.7653231576436919
Threshold: 0.14
Accuracy:  0.7674295774647887
best_model_auroc_test_auroc: 0.8497795746617759
best_model_auroc_test_auprc: 0.6832739841934459

Total Processing Time: 7234.2840 sec

ABP and Batch Size Splits¶

In [112]:
RUN_ME = False
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=16,
        learning_rate=1e-4,
        weight_decay=0.0,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
In [113]:
RUN_ME = False
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=32,
        learning_rate=1e-4,
        weight_decay=0.0,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
In [114]:
RUN_ME = False
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=64,
        learning_rate=1e-4,
        weight_decay=0.0,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
In [115]:
RUN_ME = False
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=0.0,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break

ABP and Learning Rate¶

In [116]:
RUN_ME = False
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-2,
        weight_decay=0.0,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
In [117]:
RUN_ME = False
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-3,
        weight_decay=0.0,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break
In [118]:
RUN_ME = False
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=128,
        learning_rate=1e-4,
        weight_decay=0.0,
        balance_labels=False,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break

Balance Labels¶

In [119]:
RUN_ME = False
DISPLAY_MODEL_PREDICTION=True
DISPLAY_MODEL_PREDICTION_FIRST_ONLY=True

if MULTI_RUN and RUN_ME:
    (model, best_model_val_loss, best_model_auroc, experimentName) = run_experiment(
        experimentNamePrefix=None, 
        useAbp=True, 
        useEeg=False, 
        useEcg=False,
        nResiduals=12, 
        skip_connection=False,
        batch_size=64,
        learning_rate=1e-4,
        weight_decay=0.0,
        balance_labels=True,
        #pos_weight=2.0,
        pos_weight=None,
        max_epochs=MAX_EPOCHS,
        patience=PATIENCE,
        device=device
    )
    
    if DISPLAY_MODEL_PREDICTION:
        for case_id_to_check in my_cases_of_interest_idx:
            preds = predictionsForModel(case_id_to_check, model, best_model_val_loss, device)
            printModelPrediction(case_id_to_check, positiveSegmentsMap, 
                            negativeSegmentsMap, iohEventsMap, cleanEventsMap, preds, experimentName)

            if DISPLAY_MODEL_PREDICTION_FIRST_ONLY:
                break

Results (Planned results for Draft submission)¶

When we complete our experiments, we will build tables comparing a common set of measures across every experiment performed. The full set of experiments and measures is listed below.

Results from Final Rubric¶

  • Table of results (no need to include additional experiments, but main reproducibility result should be included)
  • All claims should be supported by experiment results
  • Discuss with respect to the hypothesis and results from the original paper
  • Experiments beyond the original paper
    • Each experiment should include results and a discussion
  • Ablation study

Experiments¶

  • ABP only
  • ECG only
  • EEG only
  • ABP + ECG
  • ABP + EEG
  • ECG + EEG
  • ABP + ECG + EEG

Note: each experiment will be repeated with the following time-to-IOH-event durations:

  • 3 minutes
  • 5 minutes
  • 10 minutes
  • 15 minutes

Note: the above list of experiments will be performed only if there is sufficient time and GPU capacity to complete it before the submission deadline. Should we experience constraints on this front, we will reduce our experimental coverage to the following four core experiments, which are necessary to evaluate the hypotheses included at the head of this report:

  • ABP only @ 3 minutes
  • ABP + ECG @ 3 minutes
  • ABP + EEG @ 3 minutes
  • ABP + ECG + EEG @ 3 minutes

For additional details please review the "Planned Actions" in the Discussion section of this report.

Measures¶

  • AUROC
  • AUPRC
  • Sensitivity
  • Specificity
  • Threshold
  • Loss Shrinkage
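To make the first four measures concrete, here is a dependency-free sketch of how AUROC, AUPRC, sensitivity, and specificity are defined. In our pipeline these values are computed by library routines, so treat the helper names here as illustrative only.

```python
def auroc(y_true, y_score):
    """Probability a random positive outscores a random negative (ties count 0.5)."""
    pos = [s for s, y in zip(y_score, y_true) if y == 1]
    neg = [s for s, y in zip(y_score, y_true) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def auprc(y_true, y_score):
    """Average precision: precision summed over each recall increment."""
    order = sorted(range(len(y_score)), key=lambda i: -y_score[i])
    n_pos, tp, fp, ap = sum(y_true), 0, 0, 0.0
    for i in order:
        if y_true[i] == 1:
            tp += 1
            ap += (tp / (tp + fp)) / n_pos
        else:
            fp += 1
    return ap

def sens_spec(y_true, y_score, threshold):
    """Sensitivity and specificity at a fixed decision threshold."""
    tp = sum(1 for y, s in zip(y_true, y_score) if y == 1 and s >= threshold)
    fn = sum(1 for y, s in zip(y_true, y_score) if y == 1 and s < threshold)
    tn = sum(1 for y, s in zip(y_true, y_score) if y == 0 and s < threshold)
    fp = sum(1 for y, s in zip(y_true, y_score) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)
```

The "Threshold" measure in our tables is simply the decision threshold passed to `sens_spec` that balances sensitivity and specificity on the validation set.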

[ TODO for final report - collect data for all measures listed above. ]

[ TODO for final report - generate ROC and PRC plots for each experiment ]

We are collecting a broad set of measures for each experiment so that we can comprehensively compare our results against all comparable experiments in the original paper. However, our key experimental results will focus on the subset of these measures that addresses the main experiments defined at the beginning of this notebook.

The key experimental result measures will be as follows:

  • For 3 minutes ahead of the predicted IOH event:
    • compare AUROC and AUPRC for ABP only vs ABP+ECG
    • compare AUROC and AUPRC for ABP only vs ABP+EEG
    • compare AUROC and AUPRC for ABP only vs ABP+ECG+EEG

Model comparison¶

The following table is Table 3 from the original paper which presents the measured values for each signal combination across each of the four temporal predictive categories:

Area under the Receiver-operating Characteristic Curve, Area under the Precision-Recall Curve, Sensitivity, and Specificity of the model in predicting intraoperative hypotension

We have not yet completed the experiments necessary to measure our reproduced model's performance and determine whether our results accurately represent those of the original paper. These details are expected to be included in the final report.

As of the draft submission, the reported evaluation measures of our model are too good to be true (all measures are 1.0). We suspect that there is data leakage in the dataset splitting process and will address this in time for the final report.
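Our working hypothesis for the leakage is that segments from the same surgical case ended up in more than one split. A minimal sketch of the fix we plan, partitioning by case ID so that every segment from a case lands in exactly one split (the `segments` structure and split ratios here are illustrative, not our actual pipeline's):

```python
import random

def split_by_case(segments, train=0.7, val=0.1, seed=42):
    """segments: list of (case_id, segment) pairs; returns case-disjoint splits."""
    # Shuffle and partition the *case IDs*, not the individual segments,
    # so no case can contribute to more than one split.
    case_ids = sorted({cid for cid, _ in segments})
    random.Random(seed).shuffle(case_ids)
    n_train = int(len(case_ids) * train)
    n_val = int(len(case_ids) * val)
    train_ids = set(case_ids[:n_train])
    val_ids = set(case_ids[n_train:n_train + n_val])
    split = {"train": [], "val": [], "test": []}
    for cid, seg in segments:
        bucket = "train" if cid in train_ids else "val" if cid in val_ids else "test"
        split[bucket].append((cid, seg))
    return split
```

With this scheme, a model can no longer memorize one patient's waveform segments during training and be scored on the same patient at evaluation time.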

Discussion¶

Discussion (10) from Final Rubric¶

  • Implications of the experimental results, whether the original paper was reproducible, and if it wasn’t, what factors made it irreproducible
  • “What was easy”
  • “What was difficult”
  • Recommendations to the original authors or others who work in this area for improving reproducibility
  • (specific to our group) "I have communicated with Maciej during OH. The draft looks good and I would expect some explanations/analysis on the final report on why you get 1.0 as AUROC."
    • Discuss our bug: we believed we were sampling dozens of different patients' cases, but we were actually training the model on the same segments extracted from a single patient's case over and over. We were thus massively overfitting one patient's training data, then unwittingly using that same patient's data for validation and testing, which produced perfect classification during inference.

Feasibility of reproduction¶

Our assessment is that this paper will be reproducible. The outstanding risk is that each experiment can take up to 7 hours to run on hardware available within the team (i.e., ~7 h to run ~70 epochs on a desktop with an AMD Ryzen 7 3800X 8-core CPU, an RTX 2070 SUPER GPU, and 32GB RAM). There are 28 experiments in total (7 combinations of signal inputs, each at 4 time horizons). Should our team be unable to complete all of the experiments represented in Table 3 of our selected paper, we will reduce the set to focus solely on the experiments directly related to the hypotheses described at the beginning of this notebook (i.e., 4 combinations of interest: ABP alone, ABP+EEG, ABP+ECG, ABP+ECG+EEG), for a new total of 16 experiments to run.

Planned ablations¶

Our proposal included a collection of potential ablations to be investigated:

  • Remove ResNet skip connection
  • Reduce # of residual blocks from 12 to 6
  • Reduce # of residual blocks from 12 to 1
  • Eliminate dropout from residual block
  • Max pooling configuration
    • smaller size/stride
    • eliminate max pooling
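To make these ablation knobs concrete, here is a toy numpy sketch of a residual block with the skip connection and dropout exposed as flags, and the block count as a parameter. The real model is a 1-D CNN in PyTorch; the matrix "conv" stand-in and shapes below are illustrative only, not the paper's architecture.

```python
import numpy as np

def residual_block(x, weight, skip_connection=True, dropout_p=0.0, rng=None):
    out = np.maximum(weight @ x, 0.0)      # stand-in for conv + ReLU
    if dropout_p > 0.0:                    # ablation: eliminate dropout
        rng = np.random.default_rng(0) if rng is None else rng
        keep = rng.random(out.shape) >= dropout_p
        out = out * keep / (1.0 - dropout_p)
    if skip_connection:                    # ablation: remove the skip connection
        out = out + x
    return out

def network(x, weights, skip_connection=True):
    # Ablation: reduce residual blocks from 12 to 6 (or 1) by passing fewer weights.
    for w in weights:
        x = residual_block(x, w, skip_connection=skip_connection)
    return x
```

Each planned ablation then becomes a single argument change (`skip_connection=False`, `dropout_p=0.0`, or a shorter `weights` list) rather than a separate model definition.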

Given the amount of time required to conduct each experiment, our team intends to choose only a small number of ablations from this set. Further, we intend to perform ablation analysis only against the best-performing signal combinations and time horizon from the reproduction experiments. In other words, we intend to perform ablation analysis against the following training combinations, and only against the models trained with data measured 3 minutes prior to an IOH event:

  • ABP alone
  • ABP + ECG
  • ABP + EEG
  • ABP + ECG + EEG

Time and GPU resource permitting, we will complete a broader range of experiments. For additional details, please see the section below titled "Plans for next phase".

Nature of reproduced results¶

Our team intends to address how our experimental results align with the results published in the paper in the final submission of this report. The time available for model training and result analysis during preparation of the draft notebook was not sufficient to complete a large number of experiments.

What was easy? What was difficult?¶

The most difficult aspect of preparing this draft was the data preprocessing.

  • First, the source data is unlabelled, so our team was responsible for implementing analysis methods that identify positive (IOH event occurred) and negative (IOH event did not occur) segments by running a lookahead analysis over our input training set.
  • Second, the volume of raw data is in excess of 90GB. A non-trivial amount of compute was required to minify the input data to only include the data tracks of interest to our experiments (i.e., ABP, ECG, and EEG tracks).
  • Third, our team found it difficult to trace the definition of the jSQI signal quality index referenced in the paper. Multiple references across multiple papers had to be traversed to understand which variant of the quality index was intended.
    • The only available source code related to the signal quality index is that referenced by our paper in [5]. The source code was not directly linked from the paper, but the GitHub repository of the corresponding author of reference [5] led us to MATLAB source code for the signal quality index described in that paper. That code is available here: https://github.com/cliffordlab/PhysioNet-Cardiovascular-Signal-Toolbox/tree/master/Tools/BP_Tools
    • Our team had insufficient time to port this signal quality index to Python for use in our investigation, or to set up a MATLAB environment in which to assess our source data using the above MATLAB functions, but we expect to complete this as part of our final report.
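The lookahead labeling described in the first bullet can be sketched as follows. The 65 mmHg threshold and the ≥1 minute event duration follow the IOH definition used by the paper; the 1 Hz sampling rate, 3-minute lead time, and helper names below are illustrative assumptions, not our pipeline's exact parameters.

```python
IOH_MAP_MMHG = 65.0  # mean arterial pressure threshold defining IOH

def is_ioh_event(map_series, start, event_s=60, hz=1):
    """True if MAP stays below the IOH threshold for event_s seconds from `start`."""
    window = map_series[start:start + event_s * hz]
    return len(window) == event_s * hz and all(m < IOH_MAP_MMHG for m in window)

def label_segment(map_series, t, lead_s=180, event_s=60, hz=1):
    """Label the segment ending at sample t by looking lead_s seconds ahead."""
    return 1 if is_ioh_event(map_series, t + lead_s * hz, event_s, hz) else 0
```

For the 5-, 10-, and 15-minute horizons, only `lead_s` changes; the event definition itself stays fixed.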

Suggestions to paper author¶

The most notable suggestion would be to correct the hyperparameters published in Supplemental Table 1. Specifically, the output size for residual blocks 11 and 12 for the ECG and ABP data sets is listed as 496x6. This is a typo and should read 469x6. The typo became apparent when applying the size-down operation within residual block 11 and observing that the tensor dimensions were misaligned.
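The mismatch can be verified with the standard 1-D convolution/pooling output-length formula. Note that the kernel-2, stride-2 size-down and the length-938 input (469 × 2) used below are our inferences from the surrounding table entries, not values stated explicitly in the paper:

```python
import math

def size_down_len(n_in, kernel=2, stride=2, padding=0):
    """Standard 1-D conv/pool output-length formula:
    floor((n_in + 2*padding - kernel) / stride) + 1
    """
    return math.floor((n_in + 2 * padding - kernel) / stride) + 1

# A stride-2 size-down halves the temporal dimension:
print(size_down_len(938))  # 469, not the 496 printed in the table
```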

Additionally, more explicit references to the signal quality index assessment tools should be added. Our team could not find a link to the MATLAB source code described in reference [5], and had to manually discover the GitHub profile of the lab of the corresponding author of reference [5] in order to find MATLAB source corresponding to the metrics described therein.

Plans for next phase¶

Our team plans to accomplish the following goals in service of preparing the Final Report:

  • Implement the jSQI filter to remove any training data with aberrant signal quality per the threshold defined in our original paper.
  • Execute the following experiments:
    • Measure predictive quality of the model trained solely with ABP data at 3 minutes prior to IOH events.
    • Measure predictive quality of the model trained with ABP+ECG data at 3 minutes prior to IOH events.
    • Measure predictive quality of the model trained with ABP+EEG data at 3 minutes prior to IOH events.
    • Measure predictive quality of the model trained with ABP+ECG+EEG data at 3 minutes prior to IOH events.
  • Gather our measures for these experiments, compare them against the published results from our selected paper, and determine whether or not we are successfully reproducing the results outlined in the paper.
  • Ablation analysis:
    • Execute the following ablation experiments:
      • Repeat the four experiments described above while reducing the number of residual blocks in the model from 12 to 6.
  • Time- and/or GPU-resource permitting, we will complete the remaining 24 experiments as described in the paper:
    • Measure predictive quality of the model trained solely with ABP data at 5, 10, and 15 minutes prior to IOH events.
    • Measure predictive quality of the model trained with ABP+ECG data at 5, 10, and 15 minutes prior to IOH events.
    • Measure predictive quality of the model trained with ABP+EEG data at 5, 10, and 15 minutes prior to IOH events.
    • Measure predictive quality of the model trained with ABP+ECG+EEG data at 5, 10, and 15 minutes prior to IOH events.
    • Measure predictive quality of the model trained solely with ECG data at 3, 5, 10, and 15 minutes prior to IOH events.
    • Measure predictive quality of the model trained solely with EEG data at 3, 5, 10, and 15 minutes prior to IOH events.
    • Measure predictive quality of the model trained with ECG+EEG data at 3, 5, 10, and 15 minutes prior to IOH events.
    • Additional ablation experiments:
      • For the four core experiments (ABP, ABP+ECG, ABP+EEG, ABP+ECG+EEG each trained on event data occurring 3 minutes prior to IOH events), perform the following ablations:
        • Repeat experiment while eliminating dropout from every residual block
        • Repeat experiment while removing the skip connection from every residual block
        • Repeat the four experiments described above while reducing the number of residual blocks in the model from 12 to 1.
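As a rough sketch of the planned jSQI filtering step: the `sqi_fn` argument below is a stand-in for a future Python port of the toolbox's jSQI routine, and the toy quality score and the 0.8 threshold are hypothetical placeholders rather than the paper's published values.

```python
import numpy as np

def filter_by_sqi(segments, sqi_fn, threshold=0.8):
    """Keep only training segments whose signal-quality score meets a
    threshold. `sqi_fn` stands in for a Python port of the jSQI routine
    from the PhysioNet Cardiovascular Signal Toolbox; the 0.8 threshold
    is illustrative, not the value from our original paper.
    """
    return [seg for seg in segments if sqi_fn(seg) >= threshold]

def toy_sqi(seg, lo=20.0, hi=250.0):
    """Toy quality score: fraction of samples in a physiologic ABP range."""
    seg = np.asarray(seg)
    return np.mean((seg > lo) & (seg < hi))

good = np.full(100, 90.0)                                 # plausible ABP
bad = np.concatenate([np.full(50, 90.0), np.zeros(50)])   # half flat-lined
print(len(filter_by_sqi([good, bad], toy_sqi)))  # 1 (only `good` kept)
```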

References¶

  1. Jo Y-Y, Jang J-H, Kwon J-m, Lee H-C, Jung C-W, Byun S, et al. “Predicting intraoperative hypotension using deep learning with waveforms of arterial blood pressure, electroencephalogram, and electrocardiogram: Retrospective study.” PLoS ONE, (2022) 17(8): e0272055 https://doi.org/10.1371/journal.pone.0272055
  2. Hatib F, Jian Z, Buddi S, Lee C, Settels J, Sibert K, Rinehart J, Cannesson M. “Machine-learning Algorithm to Predict Hypotension Based on High-fidelity Arterial Pressure Waveform Analysis.” Anesthesiology (2018) 129:4 https://doi.org/10.1097/ALN.0000000000002300
  3. Bao X, Kumar SS, Shah NJ, et al. "Acumen™ hypotension prediction index guidance for prevention and treatment of hypotension in noncardiac surgery: a prospective, single-arm, multicenter trial." Perioperative Medicine (2024) 13:13 https://doi.org/10.1186/s13741-024-00369-9
  4. Lee, HC., Park, Y., Yoon, S.B. et al. VitalDB, a high-fidelity multi-parameter vital signs database in surgical patients. Sci Data 9, 279 (2022). https://doi.org/10.1038/s41597-022-01411-5
  5. Li Q., Mark R.G. & Clifford G.D. "Artificial arterial blood pressure artifact models and an evaluation of a robust blood pressure and heart rate estimator." BioMed Eng OnLine. (2009) 8:13. pmid:19586547 https://doi.org/10.1186/1475-925X-8-13
  6. Park H-J, "VitalDB Python Example Notebooks" GitHub Repository https://github.com/vitaldb/examples/blob/master/hypotension_art.ipynb

Public GitHub Repo (5)¶

  • Publish your code in a public repository on GitHub and attach the URL in the notebook.
  • Make sure your code is documented properly.
    • A README.md file describing the exact steps to run your code is required.
    • Check “ML Code Completeness Checklist” (https://github.com/paperswithcode/releasing-research-code)
    • Check “Best Practices for Reproducibility” (https://www.cs.mcgill.ca/~ksinha4/practices_for_reproducibility/)

Video Presentation (Requirements from Rubric)¶

Walkthrough of the notebook; no need to make slides. We expect a well-timed, well-presented walkthrough. You should clearly explain what the original paper is about (the general problem, the specific approach taken, and the claimed results) and what you encountered when you attempted to reproduce the results. You should use the time given to you, neither too much nor too little.

  • <= 4 mins
  • Explain the general problem clearly
  • Explain the specific approach taken in the paper clearly
  • Explain reproduction attempts clearly
In [120]:
time_delta = np.round(timer() - global_time_start, 3)
print(f'Total Notebook Processing Time: {time_delta:.4f} sec')
Total Notebook Processing Time: 30851.6390 sec